Session 1.6: Slovak public metadata governance and management based on linke... - semanticsconference
This document proposes establishing public linked data governance and management in the Slovak Republic based on methodologies used by EU institutions. It outlines establishing rules for interoperability levels of open public data, creating a central ontological model and governance structure to manage data quality and interoperability. It also proposes a linked data management lifecycle to publish, deploy, manage changes to and retire ontologies and URIs according to a change request process in order to establish central governance of public metadata in Slovakia.
A distributed network of digital heritage information - Semantics Amsterdam - Enno Meijers
This document discusses strategies for improving discovery of digital heritage information across Dutch cultural institutions. It identifies problems with the current infrastructure based on OAI-PMH including lack of semantic alignment and inefficient data integration. The proposed strategy is to build a distributed network based on Linked Data principles, with a registry of organizations and datasets, a knowledge graph with backlinks to support resource discovery, and virtual data integration using federated querying of Linked Data sources. This will improve usability, visibility, and sustainability of digital heritage information in the Netherlands.
Presenter: Stuart Macdonald
Presentation first given at Open Knowledge Scotland event at Inspace in Edinburgh, 13 May 2010.
EDINA project to create an online crowdsourcing tool which will combine data from digitised Scottish Post Office Directories (PODs) with contemporaneous historical maps
DBpedia: A Public Data Infrastructure for the Web of Data - Sebastian Hellmann
The document discusses the DBpedia project, which extracts structured data from Wikipedia to build a multilingual knowledge graph. It describes DBpedia's goals of making this data openly available and supporting its community. The DBpedia Association is being formed as a non-profit to oversee the infrastructure and support contributors. Funding will come from donations and sponsorships. Upcoming events include the DBpedia Community Meeting coinciding with the SEMANTiCS conference in September.
This document discusses developing a distributed network of digital heritage information in the Netherlands. It proposes taking a resource-centric linked data approach, implementing linked data principles in data sources, building a knowledge graph, and creating a registry to link organizations, datasets, and resources. This would allow for federated querying across distributed data sources and improved discovery of digital heritage information.
Linked Data allows evolving the web into a global data space by publishing structured data on the web using RDF and by linking data items across different data sources. It follows the Linked Data principles of using URIs to identify things and HTTP URIs to look up those names, providing useful RDF information when URIs are dereferenced, and including RDF links to discover related data. The amount of published Linked Data on the web has grown enormously since 2007. Large data sources like DBpedia extract structured data from Wikipedia and act as hubs by interlinking different data sets, enabling new applications and search over integrated data.
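The four Linked Data principles can be illustrated with a tiny, library-free model of an RDF graph. This is a sketch only: triples are plain Python tuples, and while the DBpedia and GeoNames URIs shown are the ones commonly used for Amsterdam, they are included here for illustration, not guaranteed to resolve.

```python
# A minimal, library-free sketch of the Linked Data principles.
# Triples are plain (subject, predicate, object) tuples.
AMSTERDAM_DBPEDIA = "http://dbpedia.org/resource/Amsterdam"
AMSTERDAM_GEONAMES = "http://sws.geonames.org/2759794/"
RDFS_LABEL = "http://www.w3.org/2000/01/rdf-schema#label"
OWL_SAME_AS = "http://www.w3.org/2002/07/owl#sameAs"

# Principles 1-3: HTTP URIs name things, and dereferencing a URI
# should return useful RDF statements about it.
graph = {
    (AMSTERDAM_DBPEDIA, RDFS_LABEL, "Amsterdam"),
    # Principle 4: link out to other data sources so clients can discover more.
    (AMSTERDAM_DBPEDIA, OWL_SAME_AS, AMSTERDAM_GEONAMES),
}

def outgoing_links(graph, subject, predicate):
    """Follow RDF links from a subject, e.g. to find equivalent resources."""
    return [o for s, p, o in graph if s == subject and p == predicate]

print(outgoing_links(graph, AMSTERDAM_DBPEDIA, OWL_SAME_AS))
```

Following `owl:sameAs` links in this way is exactly how hubs like DBpedia let applications hop from one dataset to another.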
DBpedia is a crowd-sourced effort to extract structured data from Wikipedia and Wikidata. It provides a public SPARQL endpoint to query this multi-domain, multilingual dataset. The DBpedia Association was founded in 2014 as a non-profit to oversee DBpedia and aims to improve uptime, data quality, and integration with other sources. It relies on funding and contributions from members to achieve goals like 99.99% uptime across languages and domains. The document promotes joining the DBpedia Association and participating in future events like a DBpedia meeting at the SEMANTiCS 2016 conference.
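The public SPARQL endpoint mentioned above accepts plain HTTP requests. The sketch below builds such a request with only the standard library but does not send it, so it stays network-free; the query itself is illustrative (DBpedia's Virtuoso endpoint predefines common prefixes such as `rdfs`).

```python
# A hedged sketch of querying DBpedia's public SPARQL endpoint.
# The request is built but not sent; uncomment the last lines to run it live.
import urllib.parse

DBPEDIA_ENDPOINT = "https://dbpedia.org/sparql"

# Ask for up to ten labels of the resource for Berlin.
query = """
SELECT ?label WHERE {
  <http://dbpedia.org/resource/Berlin> rdfs:label ?label .
} LIMIT 10
"""

params = urllib.parse.urlencode({"query": query, "format": "application/json"})
request_url = f"{DBPEDIA_ENDPOINT}?{params}"
print(request_url[:80])

# import urllib.request, json
# with urllib.request.urlopen(request_url) as resp:
#     print(json.load(resp)["results"]["bindings"])
```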
Wikidata is a free and open knowledge base that can be edited by anyone to store structured data. It currently has over 33.5 million articles and 1.9 billion edits in 287 languages. Wikidata provides structured, collaborative, free, open, multilingual, and referenced data through its API and licenses its data under CC0 to allow easy access and reuse. It helps projects like Wikipedia by providing integrated access to its data and supports smaller languages and communities through micro-contributions. In 2015, Google's Freebase project moved its data to Wikidata, increasing its scope and ecosystem.
The document discusses open educational resources (OER) and the MANTRA project. It defines OER as teaching, learning, and research materials that are freely available or have an open license allowing free use. The MANTRA project aims to create online learning modules about research data management and make them available as OER. Key lessons from creating the modules included underestimating the time needed, challenges of authoring content, and ensuring consistency across materials.
DYAS: The Greek Research Infrastructure Network for the Humanities - ariadnenetwork
Presentation by:
Panos Constantopoulos
Athens University of Economics and Business,
Athena Research Centre
Costis Dallas
Toronto University,
Panteion University,
Athena Research Centre
Presenter: Dimitris Gavrilis
Full-day session on archaeological infrastructures and services at the 18th Cultural Heritage and New Technologies (CHNT) conference
Vienna, Austria
11-13 November 2013
This document discusses the interaction between digital libraries and digital humanities. It begins by defining digital libraries and digital humanities, noting that digital libraries have advanced digitization efforts and made data more open and reusable, which supports digital humanities work. Digital humanities drives new approaches and questions in digital libraries. The document then discusses specific examples of digital library and digital humanities collaboration in Serbia and through organizations like DARIAH and LIBER. It concludes by questioning whether the relationship between digital libraries and digital humanities will be a long-term partnership or a short-term convenience.
The document discusses persistent identifiers and DataCite, a global consortium that provides standards and best practices for citing and identifying research datasets and other non-textual materials. It describes how DataCite assigns long-lasting digital object identifiers (DOIs) to datasets, along with metadata, to allow for permanent citation and discovery of research data on the internet. The benefits of DOIs include easy and permanent access to research data online and increased citation of datasets, helping data to be recognized as a valid research output.
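The value of a DOI is that one persistent name serves both humans and machines. The sketch below uses a hypothetical DOI; the `doi.org` proxy and the DataCite JSON media type for content negotiation are real, but the request is only constructed, not sent.

```python
# A sketch of what a DataCite DOI buys you: one persistent name that
# resolves for humans and machines. The DOI below is hypothetical.
import urllib.request

doi = "10.1234/example-dataset"          # hypothetical dataset DOI
landing_url = f"https://doi.org/{doi}"   # resolves to the dataset's landing page

# The same proxy serves machine-readable DataCite metadata through
# HTTP content negotiation; build (but don't send) such a request:
request = urllib.request.Request(
    landing_url,
    headers={"Accept": "application/vnd.datacite.datacite+json"},
)
print(request.full_url, request.get_header("Accept"))
```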
Europeana and schema.org
Presentation at the Dublin Core conference, special session on Schema.org, Sept 5, 2013.
Conference site: http://dcevents.dublincore.org/index.php/IntConf/dc-2013/
Harvesting Repositories: DPLA, Europeana, & Other Case Studies - eohallor
Join this discussion on the benefits and process of harvesting to aggregators such as DPLA and Europeana. Through case studies we'll outline the three stages of the process: 1) mapping, migrating, and normalizing data in open-source digital repositories, 2) making use of the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), and 3) reaping the benefits of increased exposure. Presenters welcome lively discussion and questions from participants of all technical backgrounds and skill levels.
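Stage 2, harvesting over OAI-PMH, boils down to fetching and parsing XML. The sketch below parses a hand-written, trimmed-down `ListRecords` response with the standard library; the sample record and its identifier are invented for illustration.

```python
# A sketch of parsing an OAI-PMH ListRecords response. The XML below is a
# hand-written minimal example, not output from a real repository.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:rec1</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>A digitised map of Edinburgh</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

root = ET.fromstring(SAMPLE_RESPONSE)
records = []
for record in root.findall(".//oai:record", NS):
    records.append({
        "identifier": record.findtext(".//oai:identifier", namespaces=NS),
        "title": record.findtext(".//dc:title", namespaces=NS),
    })

print(records)
```

A real harvester would loop over `resumptionToken`s to page through the full repository before the mapping and normalization of stage 1 is applied.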
This document summarizes a presentation on using digital audio archives to promote performance studies. It discusses two projects - the Baudelaire Song Project and Visualising Voice. The Baudelaire Song Project analyzes French art songs set to the poetry of Baudelaire over four years with AHRC funding. Visualising Voice uses a Europeana Research Award to create a public-facing web interface for digital audio analysis. Both projects use open-access digital archives but face challenges regarding language barriers, audio quality, copyright and data storage.
Wednesday 6 May: Hand me the data! What you should know as a humanities resea... - WARCnet
Wednesday 6 May: Hand me the data! What you should know as a humanities researcher before asking for data from a web archive, Ulrich Have, NetLab/DIGHUMLAB, Aarhus University
The Wellcome Trust is examining the possibility of a cloud platform for the storage and delivery of digitised artefacts. This platform is intended for the Trust's own use as well as others. A version of this presentation with embedded notes and video can be viewed on Google docs: http://bit.ly/1GRKqN4 or PowerPoint online: http://bit.ly/1CwGsrE
Wikidata, a target for Europeana's semantic strategy - GLAM-WIKI 2015 - Antoine Isaac
"Wikidata, a target for Europeana's semantic strategy"/ Presentation at the GLAM-Wiki conference with Valentine Charles, Hugo Manguinhas, Antoine Isaac, Vladimir Alexiev http://nl.wikimedia.org/wiki/GLAM-WIKI_2015/
d:swarm - A Library Data Management Platform Based on a Linked Open Data Appr... - Jens Mittelbach
D:SWARM is a graphical, web-based ETL modelling tool for importing data from heterogeneous sources in different formats, mapping input to output schemata, designing transformation workflows, and loading the transformed data into a property graph database. It is developed in a collaborative project by SLUB Dresden (www.slub-dresden.de) and Avantgarde Labs GmbH (www.avantgarde-labs.de), and offers additional functionality such as exporting data models as RDF and sharing mappings and transformation workflows.
Rebecca Grant, Kathryn Cassidy, Marta Bustillo - Implementing Orphan Works Le... - dri_ireland
Presentation made by Rebecca Grant (Digital Repository of Ireland) Kathryn Cassidy (Digital Repository of Ireland) and Marta Bustillo (Trinity College Dublin) at Open Repositories, Dublin on 14 June 2016. The presentation gives an overview of the EU Orphan Works Directive and its implementation in Ireland, and discusses how the Digital Repository of Ireland adapted its workflows and UI to allow the publication of registered Orphan Works.
CEMEC Discovery Programme discussion digital heritage - Marco Streefkerk
The document summarizes a meeting about the Digital Heritage Netherlands (DEN) Foundation and its CEMEC project. DEN is the Dutch knowledge hub for digital culture that supports good practices for digitization at cultural heritage institutions. Its core mission is to share knowledge and experiences about technology and practices for digital heritage. DEN encourages institutions to invest in open technologies, create sustainable services, and use common standards to jointly create a national Digital Cultural Collection. The CEMEC project aims to establish shared digital heritage services at a national level in the Netherlands.
New approaches for data acquisition at Europeana: IIIF, sitemaps and schema.o... - Nuno Freire
Presentation on experiments at Europeana regarding new methods of aggregating metadata.
Presented at the Seminar Linked Data in Research and Cultural Heritage, on 1st of May 2017.
Estermann Wikidata and Heritage Data 20170914 - Beat Estermann
This document discusses Wikidata and cultural heritage data. It aims to establish Wikidata as a central hub for cultural heritage data by ingesting related data and enhancing it. Key challenges include getting institutions to provide open data, assisting with data scraping, addressing coverage biases, mapping data models during ingestion, and dealing with incorrect data. Maintaining data quality over time through processes like updating and dispute resolution is also challenging. The document explores how Wikidata can better integrate with other databases and cultural heritage organizations to maximize data sharing and reuse.
Presentation at the Education Session of the American Art Collaborative (AAC) Linked Open Data Initiative, 31 March 2015. http://americanartcollaborative.org/
Slides for Culture Hack panel @SXSW2013 : http://schedule.sxsw.com/2013/events/event_IAP4580
Some slides re-used from Harry Verwayen (http://www.slideshare.net/hverwayen/business-model-innovation-open-data) and Julia Fallon
Linked Data (1st Linked Data Meetup Malmö) - Anja Jentzsch
This document discusses Linked Data and outlines its key principles and benefits. It describes how Linked Data extends the traditional web by creating a single global data space using RDF to publish structured data on the web and by setting links between data items from different sources. The document outlines the growth of Linked Data on the web, with over 31 billion triples from 295 datasets as of 2011. It provides examples of large Linked Data sources like DBpedia and discusses best practices for publishing, consuming, and working with Linked Data.
European databases in cultural heritage: making connections - CARARE
This document summarizes information about several European databases and initiatives for sharing cultural heritage data online. It introduces CARARE, which helps institutions share digital content with Europeana. It then discusses Europeana, a platform for over 50 million digital cultural heritage items, including 1.5 million archaeology items. The document outlines challenges of aggregating data from different sources and standards into Europeana, and how CARARE and other aggregators work to map metadata into a common format. It also introduces the ARIADNE Plus research infrastructure, which aims to support archaeology researchers through an online catalogue of datasets and related services and tools.
Europeana and the Mediterranean Region by Dov Winer
Presentation at the GID Parmenides Conference
Towards a Mediterranean Science Area
Mediterranean Wealth and Diversity: Biology and Culture
at the Bibliotheca Alexandrina, Alexandria 21-24 June 2010
- Europeana is a digital library system that provides access to cultural heritage collections across Europe through APIs and a portal.
- The Europeana Semantic Elements model is currently used but the Europeana Data Model is being developed to better preserve original metadata while enabling interoperability.
- The Europeana Data Model (EDM) is based on standards such as OAI-ORE, Dublin Core, and SKOS to organize object metadata from different providers in a semantic web framework. It allows a distinction between objects and records while supporting complex objects and vocabularies.
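EDM's core distinction between the real-world object and the record about it can be sketched with plain triples, no RDF library required. The example URIs and the object metadata below are invented for illustration; the `edm:`, `ore:`, and `dc:` namespace URIs are the standard ones.

```python
# A library-free sketch of EDM's object/record distinction:
# edm:ProvidedCHO is the cultural heritage object itself, while
# ore:Aggregation is the provider's record that points to it.
EDM = "http://www.europeana.eu/schemas/edm/"
ORE = "http://www.openarchives.org/ore/terms/"
DC = "http://purl.org/dc/elements/1.1/"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

cho = "http://example.org/object/painting-1"          # the object itself
aggregation = "http://example.org/aggregation/painting-1"  # the record about it

triples = [
    # The cultural heritage object, described with Dublin Core.
    (cho, RDF_TYPE, EDM + "ProvidedCHO"),
    (cho, DC + "title", "Portrait of an unknown woman"),
    # The aggregation ties the object to its digital representation.
    (aggregation, RDF_TYPE, ORE + "Aggregation"),
    (aggregation, EDM + "aggregatedCHO", cho),
    (aggregation, EDM + "isShownBy", "http://example.org/images/painting-1.jpg"),
]

for s, p, o in triples:
    print(s, p, o)
```

Keeping the two resources separate is what lets Europeana preserve a provider's original record while still describing the object with richer, interlinked vocabularies.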
CLARIAH Toogdag 2018: A distributed network of digital heritage information - Enno Meijers
Slides of my keynote at the CLARIAH Toogdag 2018 on 9 March at the National Library of the Netherlands. The main topics were the development of the distributed digital heritage network and the alignment to and cooperation with the CLARIAH infrastructure and data. It also points out some current limitations of semantic web technology.
Europeana as a Linked Data (Quality) case - Antoine Isaac
Presentation for the 3rd Workshop on Humanities in the Semantic Web (WHiSe), co-located with the 17th Extended Semantic Web Conference (ESWC 2020)
June 2, 2020, online
http://whise.cc/2020/
The Europeana Strategy and Linked Open Data - David Haskiya
The document discusses Europeana's strategy for 2015-2020 and how linked open data and linked open data technologies will help realize this strategy. Key points:
- Europeana's strategy is to transition from metadata to graphs and from strings to things by making data and APIs more linked and open.
- Linked open data allows data from different sources to be combined and helps make content more findable on search engines and in knowledge panels.
- Europeana labs provides APIs, tools, documentation and data to help partners publish linked open data that can be reused in the Europeana portal and other applications.
This document summarizes a talk about whether repositories are disruptive or disrupted technology. It discusses how repositories have evolved from focusing on individual institutions to being more distributed, collaborative, web-oriented and interoperable. It argues repositories should be integrated into broader information infrastructure and expose digital content for reuse on the web using standards like OAI-ORE. The document also summarizes the history of the Fedora repository project's approaches to interoperability and its plans to adopt more common web APIs and connect backend storage to cloud services. The goal is to make repositories more open and integrated parts of the global networked environment rather than closed local systems.
An introduction to the Europeana Data Model and services in the context of creating benchmarks for a cultural heritage data set. Presented at the Linked Data Benchmark Council Technical User Committee in London in November 2013.
December 2, 2015: NISO/NFAIS Virtual Conference: Semantic Web: What's New and... - DeVonne Parks, CEM
This document discusses Europeana's use of semantic web technologies and linked data to improve access to cultural heritage collections. It summarizes that Europeana aggregates metadata from various cultural institutions to provide access to over 48 million digitized objects. It has implemented the Europeana Data Model to represent metadata in a more granular, semantically linked way using vocabularies like GeoNames, DBpedia, and AAT. This has enabled automatic enrichment of metadata as well as multilingual and conceptual searching. Linked open data approaches provide technical and strategic benefits to Europeana by facilitating data sharing and enrichment across domains.
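The enrichment step described here, replacing free-text strings with vocabulary URIs ("strings to things"), can be sketched as a simple lookup. The toy gazetteer below stands in for a real matching service against GeoNames or similar vocabularies; the GeoNames identifiers shown are the ones commonly used for these cities, but the matching logic is deliberately naive.

```python
# A toy sketch of metadata enrichment: mapping place strings to
# vocabulary URIs. A real system would use fuzzy matching and
# disambiguation against the full GeoNames/DBpedia/AAT vocabularies.
GAZETTEER = {
    "amsterdam": "http://sws.geonames.org/2759794/",
    "paris": "http://sws.geonames.org/2988507/",
}

def enrich_place(value):
    """Return (original string, vocabulary URI or None)."""
    return value, GAZETTEER.get(value.strip().lower())

print(enrich_place("Amsterdam"))
```

Once a record carries the URI instead of (or alongside) the string, multilingual and conceptual search comes almost for free: every label attached to the URI becomes a possible match.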
Fondly Collisions: Archival hierarchy and the Europeana Data Model - Valentine Charles
This document discusses representing archival hierarchies in Europeana using the Europeana Data Model (EDM). It provides an example of converting a finding aid encoded in EAD to EDM to represent the hierarchical structure. Remaining challenges include representing hierarchies when metadata or digital representations are missing for certain levels. Publishing hierarchical data for both developers and end users is discussed.
The document discusses the data era of massive information production and challenges of extracting knowledge from data. It describes the growth of digital data and potential economic value of big data. Both syntactic approaches like visualizations and semantic approaches using structured data are needed to help humans and machines understand and make use of large amounts of data. Linked open data and open government data initiatives are helping to make large data sources structured and interconnected on the web.
Connecting the Dots: Linking Digitized Collections Across Metadata Silos - OCLC
This document summarizes a presentation about linking digitized collections across metadata silos. It discusses how projects like Europeana and the Digital Public Library of America have struggled to rationalize aggregated data. To better share data within and across organizations, standards and best practices need to be applied universally to connect related items and allow data to be consumed by both humans and machines. The presentation advocates for publishing data as linked open data using identifiers and schemas like Schema.org to form a knowledge graph and improve discoverability on the web.
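Publishing a digitized item with Schema.org typically means embedding JSON-LD in the item's web page. The sketch below builds such a description with the standard library; the item URI and provider name are hypothetical, while `Q23436` is the Wikidata identifier for Edinburgh, used here as the kind of well-known identifier that knits items into a knowledge graph.

```python
# A sketch of a schema.org JSON-LD description for a digitized item,
# ready to embed in a <script type="application/ld+json"> tag.
import json

item = {
    "@context": "https://schema.org",
    "@type": "CreativeWork",
    "@id": "https://example.org/items/42",            # hypothetical item URI
    "name": "Post Office Directory, Edinburgh 1865",
    # Linking to a shared identifier connects this record across silos.
    "about": {"@id": "http://www.wikidata.org/entity/Q23436"},
    "provider": {"@type": "Organization", "name": "Example Library"},
}

json_ld = json.dumps(item, indent=2)
print(json_ld)
```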
Session 1.4: A distributed network of heritage information - semanticsconference
This document discusses strategies for improving discovery of digital heritage information across Dutch cultural institutions. It identifies problems with the current infrastructure based on OAI-PMH including lack of semantic alignment and inefficient data integration. The proposed strategy is to build a distributed network based on Linked Data principles, with a registry of organizations and datasets, a knowledge graph with backlinks to support resource discovery, and virtual data integration using federated querying of Linked Data sources. This will improve usability, visibility, and sustainability of digital heritage information in the Netherlands.
Europeana is a service that aggregates metadata from cultural heritage institutions across Europe, making over 30 million objects accessible online. It uses the Europeana Data Model to standardize metadata in a way that balances granularity and compatibility with existing standards. The EDM defines classes for provided cultural works, related agents, concepts, and places to provide richer semantic descriptions. Europeana makes this metadata and links to digital objects freely available via its website and API to promote open access to cultural heritage.
A distributed network of digital heritage information by Enno Meijers - Europ... - Europeana
The document discusses the Digital Heritage Network (NDE) in the Netherlands, which aims to increase access to digital heritage information by developing a distributed network. It outlines the NDE's three-layered approach focusing on sustainability, usability, and visibility. Key challenges include poor semantic alignment and data integration issues. The network will implement Linked Data principles by maximizing usability of data at the source, building a shared terminology network, and supporting a mix of semantic and physical/virtual integration approaches like federated querying. This will help realize the vision of a semantically integrated yet distributed network for digital heritage discovery.
Similar to Open Data Masterclass - Europeana and LOD (20)
Presentation given at the event "Vos collections sur Europeana – Panorama des voies d'agrégation", organised by the Ministère de la Culture on 27 November 2018 in Paris.
The Europeana Data Model: Principles, community and innovation - Antoine Isaac
This document summarizes the Europeana Data Model (EDM), which provides principles for representing metadata from cultural heritage institutions in a connected way on the web. EDM follows linked data best practices like using existing vocabularies and minimizing formalization. It represents metadata elements like full text, rights, and quality. Developing EDM involves experts from different domains and adopting a collaborative approach. Flexibility is needed to avoid overcommitment to formal semantics while reusing standards.
Presentation at the panel "Interoperable Platforms and CLIR Initiatives: A Global Perspective" at the 2019 IIIF Conference
Göttingen, Thursday 26 June 2019
https://iiif.io/event/2019/goettingen/program/30/
Multilingual challenges and ongoing work to tackle them at Europeana - Antoine Isaac
Europeana is a digital platform that provides access to over 57 million digitized cultural heritage objects from 3,700 institutions across 44 countries. It faces challenges in being multilingual due to the large amount of metadata in over 400 languages. Europeana is working to tackle these issues through data modeling to allow for richer multilingual data, enriching metadata by linking it to external multilingual vocabularies, and exploring automatic translation of search results and content.
Semantic Interoperability at Europeana - MultilingualDSIs2018 - Antoine Isaac
Europeana is a digital platform containing over 58 million digitized cultural heritage objects from 3,700 institutions across 44 countries. The document discusses Europeana's efforts to improve semantic interoperability between these diverse datasets by developing the Europeana Data Model, enriching metadata by linking to external vocabularies, and building an Entity Collection and API to provide centralized access to contextual information about places, people, concepts, and organizations. The goal is to enable richer discovery, exploration, and reuse of Europeana's cultural heritage data on the web.
Lightweight rights modeling and linked data publication for online cultural h... - Antoine Isaac
Presentation for the special session "Lightweight rights modeling and linked data publication for online cultural heritage - DCMI2018" at the DCMI2018 conference.
http://dublincore.org/conference/2018/abstracts/#a2
Designing a multilingual knowledge graph - DCMI2018 - Antoine Isaac
Presentation for the paper "Designing a multilingual knowledge graph as service for cultural heritage" at the DCMI2018 conference https://www.dublincore.org/conferences/2018/abstracts/#559
Presentation for the IIIF Biblissima day "Innover pour redécouvrir le patrimoine écrit", 15 March 2018, Paris
http://www.biblissima-condorcet.fr/fr/actualites/innover-redecouvrir-patrimoine-ecrit-evenement-biblissima-iiif
Isaac - W3C Data on the Web Best Practices - Data Vocabularies - Antoine Isaac
The document discusses best practices for using data vocabularies on the web as developed by the W3C Data on the Web Best Practices Working Group. It recommends reusing existing standardized vocabularies when possible and choosing the appropriate formalization level for data, avoiding both over-commitment to semantics and replication of existing vocabulary terms. It also describes Europeana's experience developing its data model EDM, which reuses many existing vocabularies while requiring significant effort to research, discuss, and maintain flexibility.
This document summarizes the Europeana APIs for accessing metadata and media from the Europeana digital collection. It describes the Search API and Record API, including how to perform basic searches and get search result profiles. It also provides examples of searching, getting search fields, and accessing record metadata in different formats. The document introduces the Europeana Data Model and how digital objects and representations are submitted and stored as proxies in Europeana.
The document discusses modelling and exchanging annotations for Europeana projects. It proposes adopting the W3C Web Annotation Data Model to represent annotations in RDF using JSON-LD serialization. An Annotations API based on the W3C Web Annotation Protocol allows exchanging annotations between Europeana and platforms like HistoryPin.org and Pundit. Representing metadata annotations is also discussed to make them machine-readable and shareable across interfaces. Overall, modelling annotations interoperably and exchanging them across platforms is still a work in progress.
EuropeanaTech update - Europeana AGM 2015 - Antoine Isaac
Update on the EuropeanaTech community activities. Presentation with Greg Markus, Sound and Vision. Europeana general Assembly Meeting 2015, November 2-4 2015. http://pro.europeana.eu/event/europeana-annual-general-meeting-2015
Modelling annotations for Europeana and related projects - DARIAH-EU WS - Antoine Isaac
"Modelling annotations for Europeana and related projects" by Hugo Manguinhas, Antoine Isaac. DARIAH-EU Workshop on Practices and Context in Contemporary Annotation Activities, Hamburg, October 29-30, 2015.
Classification schemes, thesauri and other Knowledge Organization Systems - a... - Antoine Isaac
"Classification schemes, thesauri and other Knowledge Organization Systems - a Linked Data perspective".
Presentation at the Pelagios Linked Pasts event, July 20-21, 2015.
http://pelagios-project.blogspot.co.uk/2015/03/linked-pasts.html
Multilingual challenges for accessing digitized culture online - Riga Summit 15 - Antoine Isaac
"Multilingual challenges for accessing digitized culture online". Presentation at the Riga Summit on the Multilingual Digital Single Market, April 27-29 2015.
http://www.rigasummit2015.eu/
1. Europeana and Linked Open Data
Antoine Isaac, Europeana
Open Data Masterclass, KB
9 October 2012
2. Making metadata work for Europeana
Building a search engine on top of metadata is difficult
Traditional metadata quality problems: correctness, coverage
Especially when data is so heterogeneous: 100s of formats, multilingual data
We currently use a simple, flat interoperability format (ESE)
3. More semantics-enabled services
Enhance access by semantics
Query expansion, clustering of results
Exploiting various relations: "located in", "more specific concept"…
Goal: to make richer data and services available to us and others
Semantics are already there, in original metadata
Thesauri, classifications…
ESE loses information
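The kind of query expansion mentioned above can be sketched with a toy broader/narrower concept map. The vocabulary and term names below are purely illustrative assumptions, not Europeana's actual data:

```python
# Minimal sketch of semantic query expansion over "more specific concept"
# relations, using a hypothetical narrower-term map (illustrative only).
NARROWER = {
    "furniture": ["chair", "table"],
    "chair": ["armchair"],
}

def expand(term: str) -> set[str]:
    """Return the term plus all transitively narrower concepts."""
    result, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in result:
            result.add(t)
            stack.extend(NARROWER.get(t, []))
    return result

# A search for "furniture" would then also match armchairs and tables.
```

The same traversal could run over other relations such as "located in" to broaden a geographic query.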
5. Linked Data principles
1. Use URIs as names for things
2. Use HTTP URIs so that people can look up those names
3. When someone looks up a URI, provide useful information using standards (RDF, SPARQL)
4. Include links to other URIs, so that they can discover more things
Tim Berners-Lee
http://linkeddata.org/
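Principles 2 and 3 boil down to dereferencing an HTTP URI with content negotiation, asking for RDF rather than an HTML page. A minimal sketch, where the URI and the preference order of media types are assumptions for illustration, not a prescribed Europeana endpoint:

```python
from urllib.request import Request

def make_ld_request(uri: str) -> Request:
    # Content negotiation: ask the server for an RDF serialization
    # (Turtle preferred, RDF/XML as fallback) instead of HTML.
    return Request(uri, headers={"Accept": "text/turtle, application/rdf+xml;q=0.9"})

# Hypothetical URI for illustration only.
req = make_ld_request("http://example.org/id/object/123")
```

Sending this request with `urllib.request.urlopen(req)` would return an RDF description if the server honours Linked Data conventions.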
11. Matches interest for linked data in libraries, archives and museums
• Library Linked Data W3C Community Group
http://www.w3.org/2005/Incubator/lld/XGR-lld/
• LOD-LAM community
http://lod-lam.net
• IFLA Semantic Web group
http://www.ifla.org/en/swsig
12. LLD - General benefits of linked data
• Shareable
Globally unique resolvable identifiers – URI
• Extensible
"Open world" - no description is complete, anybody can add descriptive
information about the same thing
Annotations, enrichments, etc.
• Standard protocols/techniques vs. domain-specific ones
HTTP, RDF, REST
Etc.
http://www.w3.org/2005/Incubator/lld/XGR-lld/
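The "open world" extensibility benefit can be illustrated with a toy triple store: statements about the same (hypothetical) URI, contributed by different parties, simply accumulate without any description being "closed":

```python
# Toy triple store illustrating open-world extensibility.
# URIs, property names and values are illustrative only.
graph = set()

def add_triples(source_triples):
    """Merge statements from any contributor into the shared graph."""
    graph.update(source_triples)

# A library describes an object...
add_triples({("http://example.org/book/1", "dc:title", "Atlas Maior")})
# ...and anybody else can add descriptive information about the same thing.
add_triples({("http://example.org/book/1", "dc:creator", "Joan Blaeu")})
```

Because the subject is a globally unique URI, the two contributions merge into one description of the same resource.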
14. Available Library Linked Data
• Element sets/schemas/ontologies
SKOS, Dublin Core, OAI-ORE…
• Value vocabularies/thesauri/authority lists
LCSH, VIAF…
• Datasets
British Library, Chronicling America…
15. Europeana and Linked Data
Provide trusted, reference data for cultural objects
Promote the use of the technology
Promote the exchange of data in the community and with third parties: open (meta)data!
19. Re-use and linking
Currently: GeoNames, GEMET…
Data re-use can be serendipitous!
From our domain (VIAF, UDC) or others (Eurovoc)
Multilingual resources are key for us
20. Europeana Data Model
• Representing objects & others: persons, places...
• Linking to internal or external data sources
• Separating original data from enrichments
• Enabling domain-specific data profiles
• Model re-uses existing vocabularies
http://pro.europeana.eu/edm-documentation
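The separation of original data from enrichments can be sketched via EDM's proxy idea: the provider's statements and Europeana's enrichments attach to distinct resources that both stand for the same object. The class, field names and URIs below are a rough illustrative approximation, not the normative EDM terms:

```python
# Rough sketch of EDM-style proxies: original provider statements and
# Europeana enrichments live on separate resources that both point at
# the same provided object. All names/URIs here are illustrative.
from dataclasses import dataclass, field

@dataclass
class Proxy:
    proxy_for: str                        # URI of the provided object
    statements: dict = field(default_factory=dict)

provider_proxy = Proxy("http://example.org/object/1",
                       {"dc:subject": "painting"})            # original data
europeana_proxy = Proxy("http://example.org/object/1",
                        {"dc:subject": "http://example.org/concept/painting"})  # enrichment
```

Both proxies describe the same object, but the provenance of each statement stays distinguishable, which is what allows enrichments to be added or retracted without touching the source data.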
21. What is submitted to Europeana?
1. Thumbnails
2. Metadata
3. Links to digital objects online
We created a Linked Open Data pilot: almost by word of mouth, we invited some of our partners to let us publish their data as LOD. These 3 million records are now online. If you want to know more, check data.europeana.eu