Empire State Digital Network (ESDN) is a New York State service hub for DPLA. This brief project update was presented at the ASCLA Collaborative Digitization Interest Group at ALA Midwinter 2015 in Chicago.
The document discusses UNLV Libraries' project to transform their digital collection metadata into linked open data. It describes how the project started as a study group and literature review in 2012. The goals were to preserve metadata richness when converting to a standard like Dublin Core and improve discoverability by publishing in the Linked Data Cloud. Technologies used included ContentDM, OpenRefine, Karma, Mulgara/Virtuoso triplestores, and SPARQL. The process involved cleaning, exporting, reconciling, generating RDF triples, importing to a triplestore, publishing, and querying the data. Visualizations were created using PivotViewer and RelFinder to showcase relationships. Next steps include transforming all collections and increasing linkages to other datasets.
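The "generating RDF triples" step above can be sketched in a few lines. This is an illustrative example only, not UNLV's actual pipeline: it serializes a cleaned, Dublin Core-style record as N-Triples, emitting reconciled values as URI links and everything else as string literals (the record contents and example URIs are invented).

```python
# Minimal sketch: turning a {field: value} metadata record into RDF
# triples in N-Triples syntax. Reconciled values (already URIs) become
# links; unreconciled values stay literals. No escaping is handled.
DC = "http://purl.org/dc/terms/"  # Dublin Core terms namespace

def record_to_ntriples(subject_uri, record):
    """Serialize a {field: value} record as N-Triples lines."""
    lines = []
    for field, value in sorted(record.items()):
        if value.startswith("http://") or value.startswith("https://"):
            obj = f"<{value}>"   # reconciled value: emit as a URI reference
        else:
            obj = f'"{value}"'   # plain string: emit as a literal
        lines.append(f"<{subject_uri}> <{DC}{field}> {obj} .")
    return "\n".join(lines)

record = {
    "title": "Hoover Dam construction, 1934",        # invented sample data
    "creator": "http://example.org/agents/doe-jane",  # hypothetical reconciled URI
}
print(record_to_ntriples("http://example.org/items/42", record))
```

Once triples like these are loaded into a triplestore, they can be queried with SPARQL and linked against external datasets.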
LACNIC report as presented by Sergio Rojas at ARIN's Public Policy and Members Meeting in April 2014. All ARIN 33 presentations are posted online at: https://www.arin.net/ARIN33_materials
OpenHistoricalMap - Historical Geography Wiki Style (nfgusedautoparts)
Richard Welty gave a presentation on OpenHistoricalMap (OHM), a project to add historical data and a time dimension to OpenStreetMap (OSM). OHM uses the same data model and software as OSM but adds support for modeling changes over time. It has a time-aware map viewer and tools for aligning historical maps and images. Current projects focus on modeling urban development, infrastructure changes, and places of historical and cultural significance. Challenges include representing time-varying data accurately and modeling historical events.
This document discusses work package 4 of the HABITATS project, which aims to define a spatial data infrastructure (SDI) network to enable sharing of habitat-related spatial data across Europe. The objectives are to analyze existing SDIs, define an INSPIRE-compliant architecture, deploy interoperability services, and develop a service toolkit. Deliverables include reports on the state of SDIs, the INSPIRE networking architecture, and the HABITATS networking services and service toolkit. The Reference Laboratory (RL) is introduced as a central hub to support data sharing and pilot applications. Advanced principles for the RL include supporting work with maps and services, extending INSPIRE services using KML, and including linked data functionality.
The document summarizes announcements from the 2016 ER&L CORAL User Group meeting. It lists the steering committee and web committee members, as well as new governance rules and affiliate members. It also provides highlights from a user survey report and previews upcoming plans for the web and steering committees, including updated documentation, improved versioning, and new software features.
This XML Prague 2015 pre-conference presentation shows practical uses of linked data sources. These sources can help enrich content with entities, add links to external data sources, and put the enriched content to use in question answering, machine translation, and other scenarios. The aim is to show the practical application of linked data sources in XML tooling. The presentation is an update on, and presents outcomes of, the related session held at XML Prague 2014.
Infrastructure - A necessary platform for user empowerment (RICHES)
Presentation at the ICLAM 2011 conference in New Delhi, 15-17 February 2011.
http://www.flickr.com/groups/1658954@N22/
http://www.flickr.com/photos/rokal/sets/72157626087508810/
Update on the Bentley Historical Library's ArchivesSpace Archivematica DSpace Workflow Integration project, with development provided by Artefactual Systems.
Update on the University of Michigan Bentley Historical Library's "ArchivesSpace-Archivematica-DSpace Workflow Integration" project (funded by a generous grant from the Andrew W. Mellon Foundation). The project seeks to integrate these platforms into an end-to-end digital archives workflow that will facilitate the deposit of content into a digital repository and enable the reuse of descriptive and administrative metadata across platforms. This presentation was made to the March 27, 2015 meeting of the Mid-Michigan Digital Practitioners in Ann Arbor.
This presentation provides an introduction to and walk-through of the LoCloud Geocoding application, used during the LoCloud training workshops. The application is a tool to add geographic coordinates to existing data (such as records describing items of content in digital libraries). The presentation includes a step-by-step walk-through of the application.
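The core of such a geocoding step can be sketched simply: look each record's place name up in a gazetteer and, on a match, attach coordinates. This is a minimal illustration, not the LoCloud application's actual API; the function name, field names, and the tiny in-memory gazetteer are all invented.

```python
# Minimal sketch of a geocoding enrichment step: records whose place
# name is found in a gazetteer get latitude/longitude fields added.
GAZETTEER = {
    # hypothetical lookup table; a real service would query a full gazetteer
    "amersfoort": (52.156, 5.390),
    "new delhi": (28.614, 77.209),
}

def geocode_records(records, place_field="spatial"):
    """Add lat/lon to each record whose place name matches the gazetteer."""
    for rec in records:
        place = rec.get(place_field, "").strip().lower()
        if place in GAZETTEER:
            rec["latitude"], rec["longitude"] = GAZETTEER[place]
    return records

items = [{"title": "City archive photo", "spatial": "Amersfoort"}]
geocode_records(items)
print(items[0]["latitude"], items[0]["longitude"])  # → 52.156 5.39
```

Records with unrecognized place names simply pass through unchanged, which matches the enrichment-not-replacement pattern such tools typically follow.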
The IT committee update document discusses:
1) The IT committee has 18 members and is recruiting new members until December 12th.
2) Past meetings included discussions around server virtualization, alternative solutions for the infocenter, a new Galaxy strategy, and mobile app ideas.
3) Upcoming meetings will take place in Brussels from December 14-16, with 8-10 members attending.
Whitebox GAT - an introduction by its developer (Robin Lovelace)
John Lindsay, the main developer of the little-known but extremely powerful GIS program Whitebox Geospatial Analysis Tools, describes his software at GISRUK 2014 in Glasgow.
Lovingly crafting a mountain, not by hand: managing piles of metadata (Galen Charlton)
Presentation at the British Columbia Library Conference on 1 April 2014.
Economics, time, and the burgeoning increase in the numbers of resources that libraries are acquiring or providing access to all conspire against being able to spend much time getting a metadata record perfect. Sometimes, getting a record barely good enough can be a challenge -- one record down, 50,000 more to go. In this session, Galen Charlton will discuss tools and techniques for managing ever-larger piles of metadata using open source tools, with an emphasis on iterative improvement and distributed collaboration.
Drupal Day 2011 - Thinking spatially with your open data (DrupalDay)
Talk by Juan Arevalo & Marco Giacomassi | Drupal Day Roma 2011
The Open Data movement is now moving a step forward: many governments, institutions, and businesses have recently started the process of making information available to citizens and customers. Data is now seen as a powerful instrument for increasing transparency in public administration and business policy. About 80% of this information has a spatial component that is not yet fully exploited. A range of open source solutions is now available to address this challenge; in this session we will explore their potential and possible applications. The so-called "data deluge" is here... but we can build good umbrellas. Please come to learn more about it!
Technology & Archives: Exchange Forum Programmer & Archivist Collaboration (Matthew Critchlow)
The document discusses collaboration between programmers and archivists at UC San Diego Library. It describes how they use an agile development approach called sprints, where cross-functional teams work in short cycles to develop digital library products and services. This approach promotes information sharing and coordination across projects, but can also lead to role ambiguity and increased coordination costs. The document also provides references to open-source technical tools and platforms used by the digital collections team.
The document provides an overview of the WASAPI project funded by IMLS to develop data transfer APIs between web archiving repositories. The project involves the Internet Archive, Stanford University, Rutgers University, and the University of North Texas working from 2016 to 2018 to build community, model preservation networks, and develop APIs for Archive-It and LOCKSS to standardize researcher access and exchange of archived data between service providers, repositories, and research workspaces.
The document discusses RIPE Atlas, a global Internet measurement network. It provides an overview of RIPE Atlas and its tools and use cases. Specific topics covered include IXP Country Jedi, which uses RIPE Atlas data to analyze how traffic is routed via internet exchange points; TraceMON, a new tool for visualizing traceroute data; and how network operators and others can get involved with and contribute to RIPE Atlas.
From Simple Features to Moving Features and Beyond? at OGC Member Meeting, Se... (Anita Graser)
Presentation of arxiv preprint https://arxiv.org/abs/2006.16900
Mobility data science lacks common data structures and analytical functions. This position paper assesses the current status and open issues towards a universal API for mobility data science. In particular, we look at standardization efforts revolving around the OGC Moving Features standard which, so far, has not attracted much attention within the mobility data science community. We discuss the hurdles any universal API for movement data has to overcome and propose key steps of a roadmap that would provide the foundation for the development of this API.
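One core primitive any such movement-data API would need is a trajectory: a sequence of timestamped positions that can be sampled at arbitrary times. The sketch below is purely illustrative of that idea, not the OGC Moving Features encoding or the paper's proposed API; the class and its interface are invented.

```python
# Illustrative trajectory type: timestamped (t, x, y) fixes with linear
# interpolation between them. A real mobility API would add CRS handling,
# attributes over time, and indexing over many trajectories.
from bisect import bisect_right

class Trajectory:
    def __init__(self, fixes):
        # fixes: iterable of (t, x, y); stored sorted by time
        self.fixes = sorted(fixes)

    def position_at(self, t):
        """Return the (x, y) position at time t by linear interpolation,
        clamped to the first/last fix outside the observed interval."""
        times = [f[0] for f in self.fixes]
        if t <= times[0]:
            return self.fixes[0][1:]
        if t >= times[-1]:
            return self.fixes[-1][1:]
        i = bisect_right(times, t)
        (t0, x0, y0), (t1, x1, y1) = self.fixes[i - 1], self.fixes[i]
        w = (t - t0) / (t1 - t0)
        return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))

traj = Trajectory([(0, 0.0, 0.0), (10, 10.0, 0.0)])
print(traj.position_at(5))  # → (5.0, 0.0)
```

Even this tiny example surfaces one of the standardization questions the paper raises: whether interpolation semantics (linear, stepwise, none) belong in the data model or in the API.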
Towards INSPIRE environmental 5* Open Data (Martin Tuchyna)
This document discusses exposing INSPIRE and other geo data and metadata to the semantic web. It outlines the main objective, current status, work done so far including transforming and publishing data. The outcomes of publishing linked RDF data, metadata, and APIs are described. Benefits include combining datasets while challenges involve new ways of thinking and toolset support. The forecast includes migrating to new infrastructure, enriching data with links, and improving visualization and awareness raising activities.
This presentation is a case study of the Art Institute of Chicago’s DAMS project.
LAKE, the AIC DAMS, is entirely based on modern Web standards and open-source software built in collaboration with several cultural heritage institutions. Its first beta release was launched in September 2016 and a full release is planned for early 2017.
In this session we will describe: 1) the scenario pre-dating LAKE; 2) the thought process and design phase that led to choosing the technology we are using; 3) the implementation steps and challenges; and 4) the current status and plans for expansion and long-term sustainability.
Using FME and GTFS datasets to run TransitDatabase.com (Safe Software)
Public transit data is widely available in an open data format, GTFS. Using FME, this presentation describes how to efficiently synthesize extremely large datasets from cities all over the world into the meaningful information used to run TransitDatabase.com.
This document summarizes Paul Rendek's presentation at the NIX.CZ meeting on November 24, 2016. The presentation discussed the history of internet development in the Czech Republic, the changing makeup of RIPE NCC members, key moments for internet governance following the IANA transition, and challenges around securing an open internet framework with the rise of IoT technologies. Rendek emphasized the importance of building strong local technical communities to own debates on issues central to their work and influence in policy discussions.
CTAA 2016 Portland - Aaron Antrim - GTFS - What is it? Why does it matter? (Aaron Antrim)
This document provides an overview of the General Transit Feed Specification (GTFS) which is an open data standard that allows public transit agencies to publish their transit data in a common format so that developers can build applications using that transit data. The document discusses what GTFS is, how it started with Google Transit, and examples of applications that have been built using GTFS data including trip planners, real-time arrival information, and tools for integrating transit data with other maps and services.
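Concretely, a GTFS feed is a zip of CSV files; `stops.txt`, for example, lists each stop with an identifier, name, and coordinates (those field names come from the GTFS specification). A minimal reader might look like the sketch below; the sample rows are made up.

```python
# Minimal GTFS stops.txt reader. Field names (stop_id, stop_name,
# stop_lat, stop_lon) are per the GTFS spec; sample data is invented.
import csv
import io

stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Main St & 1st Ave,45.523,-122.676
S2,Main St & 5th Ave,45.526,-122.681
"""

def load_stops(text):
    """Return {stop_id: (name, lat, lon)} from a stops.txt file body."""
    reader = csv.DictReader(io.StringIO(text))
    return {
        row["stop_id"]: (row["stop_name"],
                         float(row["stop_lat"]),
                         float(row["stop_lon"]))
        for row in reader
    }

stops = load_stops(stops_txt)
print(stops["S1"])  # → ('Main St & 1st Ave', 45.523, -122.676)
```

The same pattern extends to the other files in a feed (`routes.txt`, `trips.txt`, `stop_times.txt`), which is what makes GTFS so approachable for application developers.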
1) The document discusses using the Tiki Wiki CMS system to manage information flows at the Statistics and Bioinformatics Unit at Vall d'Hebron Institut de Recerca.
2) Key features of Tiki that are highlighted include wiki pages, structures and permissions to manage documentation, trackers for databases, and calendars for planning. It also allows connecting R to the web through plugins.
3) Examples provided include using Tiki to create task trackers, study databases, and an interactive heatmap tool powered by R. The system allows automating workflows and sharing results in a customizable centralized system.
Presentation given by Dr. Dimitris Gavrilis
Digital Curation Unit - IMIS, Athena Research Center
LoCloud Conference
Sharing local cultural heritage online with LoCloud services Amersfoort, Netherlands
5 February 2016
The document discusses improvements and outreach plans for DMPTool2. Key points include:
1. DMPTool2 adds new roles and permissions for administrators, allows collaborative plan creation, and improves search functionality.
2. New roles include templates editor, resource editor, and reviewer to help institutions implement their own requirements.
3. An outreach plan includes webinars, blogs, and advisory boards to gather input from researchers and administrators on the tool.
4. DMPTool2 will launch in January 2014 for administrators and February 2014 for public use at a new URL, while the original tool remains available temporarily.
Toward a National Digital Network: An Update from DPLA and ESDN - Metro Annua... (kerriwillette)
Presentation and panel session at the Metropolitan New York Library Council (METRO) Annual Conference 2015 held at Baruch College on January 15, 2015. Panel included Kerri Willette (ESDN Manager), Chris Stanton (ESDN Metadata Specialist), John Mignault (ESDN Technology Specialist), and Mark Matienzo (DPLA Director of Technology), moderated by Davis Erin Anderson (METRO Community Engagement Manager).
This paper surveys the landscape of linked open data projects in cultural heritage, examining the work of groups from around the world. Traditionally, linked open data has been ranked using the five-star method proposed by Tim Berners-Lee. We found this ranking to be lacking when evaluating how cultural heritage groups not merely develop linked open datasets, but find ways to use linked data to augment user experience. Building on the five-star method, we developed a six-stage life cycle describing both dataset development and dataset usage. We use this framework to describe and evaluate fifteen linked open data projects in the realm of cultural heritage.
The document provides an overview of the Dublinked Technology Workshop held on December 15th, 2011. It includes presentations on transportation data, spatial web services, linked data, and semantic data description. Breakout sessions covered topics like data publishing, discovery, web services, and advanced functions. The workshop aimed to address challenges around sharing digital data between organizations and discussed technical requirements and tools to support open government data platforms.
This presentation was provided by Ted Koppel of Auto-Graphics, Inc., Ed Riding of SirsiDynix, Andrew K. Pace of OCLC, and John Mark Ockerbloom of The University of Pennsylvania, during the NISO webinar "Library Systems & Interoperability: Breaking Down Silos," held on June 10, 2009.
Development of a MODS-RDF Cataloguing Tool for Information Professionals CONU... (Lucy McKenna)
Generating bibliographic records as linked data (LD) offers the opportunity for libraries to publish and interlink metadata on the semantic web (SW). This can expose library resources to a larger audience, increase the use of library materials, and allow for more efficient searches. The Digital Resources and Imaging Services (DRIS) department of the Library of Trinity College Dublin (TCD) hopes to move towards publishing their bibliographic records as LD and, therefore, requires a tool that allows for the creation of records in RDF - a model for representing and exchanging LD on the web as structured data.
Although libraries are publishing LD in increasing quantities, there remain many barriers to librarians making full use of the SW, including the fact that many tools used for generating LD are aimed at technical experts. This project explored a means of overcoming some of these barriers through the development of a MODS-RDF cataloguing tool for use in the library domain. MODS is a highly flexible XML metadata schema that can be used to catalogue cultural heritage materials, and MODS-RDF is an expression of this schema in RDF.
A user-centred design approach, which focuses on designing an interface from the perspective of its users, was followed when developing the tool. As such, DRIS was involved in all stages of development, including requirements gathering, interface prototyping and design, and usability testing. The results of the first phase of usability testing indicated that many of the initial user requirements were met and that DRIS were interested in developing the interface further. These results are being used to inspire the second iteration of the tool. Ongoing usability testing will be conducted to ensure that the resulting interface meets DRIS’ unique needs.
By developing a tool that allows DRIS to produce MODS-RDF records, the library will be able to interlink with other LD resources. This could allow library users to access a web of related data from a single information search, making the research process more efficient and potentially inspiring new research through the linking of disparate collections.
‘Development of a MODS-RDF Cataloguing Tool for the Digital Resources and Ima... (CONUL Conference)
The ADAPT Centre collaborated with Digital Resources and Imaging Services (DRIS) of Trinity College Dublin to develop a MODS-RDF cataloguing tool. The tool allows DRIS cataloguers to generate MODS and RDF metadata for digital collections in a user-friendly interface. Usability testing identified improvements and new requirements. The tool facilitates publishing library metadata as linked data on the semantic web to improve discovery and sharing of resources across institutions.
ResourceSync - Overview and Real-World Use Cases for Discovery, Harvesting, a... (Martin Klein)
This document provides an overview of ResourceSync, which is a framework for synchronizing web resources between systems. Some key points:
- ResourceSync was created to address limitations of existing protocols like OAI-PMH by allowing synchronization of any web resource and enabling both one-time and ongoing synchronization.
- It supports various capabilities for synchronization like resource lists, change lists, and notifications. These can be used for initial synchronization or incremental updates.
- Real-world examples are described where ResourceSync has been implemented for projects involving aggregation of digital collections, like Europeana and CLARIAH. It facilitates synchronization between diverse data sources.
- Presentations were given on how ResourceSync could also be useful
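To make the framework concrete: ResourceSync capability documents are standard XML sitemaps extended with a `rs:` namespace, so a client can parse them with ordinary XML tooling. The sketch below parses an illustrative (not real) Resource List; the sample URLs are placeholders.

```python
# Minimal sketch: parsing a ResourceSync Resource List, which is a
# standard XML sitemap extended with the ResourceSync (rs:) namespace.
# The sample document below is illustrative, not from a real endpoint.
import xml.etree.ElementTree as ET

RESOURCE_LIST = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:rs="http://www.openarchives.org/rs/terms/">
  <rs:md capability="resourcelist" at="2015-01-31T12:00:00Z"/>
  <url>
    <loc>http://example.org/records/1.xml</loc>
    <lastmod>2015-01-03T09:00:00Z</lastmod>
  </url>
  <url>
    <loc>http://example.org/records/2.xml</loc>
    <lastmod>2015-01-28T09:00:00Z</lastmod>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
      "rs": "http://www.openarchives.org/rs/terms/"}

def parse_resource_list(xml_text):
    """Return (capability, [(url, lastmod), ...]) from a resource list."""
    root = ET.fromstring(xml_text)
    capability = root.find("rs:md", NS).get("capability")
    resources = [(u.findtext("sm:loc", namespaces=NS),
                  u.findtext("sm:lastmod", namespaces=NS))
                 for u in root.findall("sm:url", NS)]
    return capability, resources

capability, resources = parse_resource_list(RESOURCE_LIST)
print(capability)      # resourcelist
print(len(resources))  # 2
```

A harvester would compare each `lastmod` against its own copy to decide which resources to re-fetch, which is what enables the incremental synchronization described above.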
Resource sync overview and real-world use cases for discovery, harvesting, an... – openminted_eu
This document summarizes an overview presentation about ResourceSync and its implementations at Hyku and the Digital Public Library of America (DPLA). Some key points:
- ResourceSync was developed as an update to OAI-PMH for synchronizing web resources between systems in a more flexible way. It supports resource lists, change lists, and dumps.
- Hyku has implemented ResourceSync publishing capabilities, and the DPLA has developed a harvester for the Hyku endpoint. This allows for incremental metadata updates rather than full resynchronization of data sets.
- Next steps include potentially supporting resource dumps in Hyku and harvesting from 3 DPLA providers using ResourceSync by the end of the year.
A Survey of Exploratory Search Systems Based on LOD Resources – Karwan Jacksi
The document summarizes Karwan Jacksi's presentation on exploratory search systems based on Linked Open Data (LOD) resources at the International Conference on Computing and Informatics in Istanbul, 2015. The presentation discusses search strategies, the semantic web, linked data, existing linked data browsers and recommenders. It then summarizes several existing exploratory search systems that utilize LOD resources, including Yovisto, Semantic Wonder Cloud, Lookup Explore Discover, Aemoo, Seevl, Linked Jazz, Discovery Hub, and inWalk. The presentation also covers computing semantic similarity, linked data techniques, and references.
MWDL as a Service Hub for the Digital Public Library of America: Updates and ... – Rebekah Cummings
The Mountain West Digital Library (MWDL) is serving as a service hub for the Digital Public Library of America (DPLA) pilot program. As part of this role, MWDL is expanding its services to include additional partner repositories, digitization projects, and training programs. The goals of the DPLA pilot program are to lay foundational infrastructure, empower local institutions, and inspire community engagement. MWDL is refining its service and funding models to ensure long-term sustainability of its expanded role providing access to digital collections across the region.
Leveraging Wikipedia as a Hub for Data Integration: the Remixing Archival Metadata Project (RAMP)
Timothy A. Thompson, Metadata Librarian (Spanish/Portuguese Specialty), Princeton University Library
The benefits of Linked Data are well known, but the supporting software ecosystem is still somewhat lacking. During this presentation we will look at the approach taken by Joinup: how we start from a formalized ontology and map it to the Joinup website. We'll give an overview of the Open Source components we created for building linked-data-based CMS applications.
Toward Semantic Data Stream - Technologies and Applications – Raja Chiky
Massive data stream processing is both a scientific challenge and an industrial concern, and given the current volume, number, and variety of data streams, existing techniques cannot meet application requirements. Semantic Web tools, such as RDF, address the problem of heterogeneous data: data streams are converted into semantic data streams by using RDF triples extended with a timestamp. To query, filter, or reason over semantic data streams, the SPARQL query language must be extended with concepts such as windowing, building on what has been done in Data Stream Management Systems. In this talk, I will present recent work on semantic data stream management, particularly extensions to the SPARQL language and associated benchmarks.
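The two core ideas in that abstract — triples extended with a timestamp, and time-based windows in the style of streaming SPARQL extensions such as C-SPARQL — can be modeled in a few lines. This is a toy illustration, not a real SPARQL engine; the sensor predicates and timestamps are invented.

```python
# Toy model (not a real SPARQL engine): a semantic data stream as RDF
# triples extended with a timestamp, plus a time-based window in the
# spirit of streaming-SPARQL "RANGE ..." window clauses.
from collections import namedtuple

# One stream element: (subject, predicate, object) plus a timestamp in seconds.
TimedTriple = namedtuple("TimedTriple", "s p o ts")

def window(stream, now, range_s):
    """Keep only the triples whose timestamp falls in [now - range_s, now]."""
    return [t for t in stream if now - range_s <= t.ts <= now]

def match(triples, p=None, o=None):
    """A trivial triple-pattern filter (fixed predicate and/or object)."""
    return [t for t in triples
            if (p is None or t.p == p) and (o is None or t.o == o)]

stream = [
    TimedTriple(":sensor1", ":hasTemp", 18, ts=100),
    TimedTriple(":sensor2", ":hasTemp", 21, ts=155),
    TimedTriple(":sensor1", ":hasTemp", 22, ts=170),
]

# Roughly "SELECT ?s WHERE { ?s :hasTemp ?v }" over a 30-second window at t=175.
recent = match(window(stream, now=175, range_s=30), p=":hasTemp")
print([t.s for t in recent])  # [':sensor2', ':sensor1']
```

A real streaming-SPARQL engine evaluates such windows continuously as new triples arrive; the window here is computed once, for a single instant, purely to show the shape of the operation.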
Understanding Globus Data Transfers with NetSage – Globus
NetSage is an open, privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage traditionally has used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks worldwide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several different example use cases that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to the Globus users?
Mountain West Digital Library as a Service Hub for the Digital Public Library... – Sandra McIntyre
Webinar from the Mountain West Digital Library
Sandra McIntyre, MWDL Director
Rebekah Cummings, MWDL Assistant Director/Outreach Librarian
The Mountain West Digital Library (MWDL) provides a central search portal to over 800,000 digital resources from memory institutions in Utah, Nevada, Idaho, Arizona, and Hawaii. As a program of the Utah Academic Library Consortium for the last twelve years, MWDL brings together 122 partners, including academic libraries, public libraries, archives, museums, historical societies, and government agencies, to share expertise and resources for digitization, hosting, and aggregated search. As one of the first six Service Hubs to the Digital Public Library of America, MWDL provides the on-ramp for DPLA participation to memory institutions in the Mountain West.
Sandra and Rebekah will talk about how MWDL became a Service Hub for the DPLA and what being a Service Hub entails. They will also discuss upcoming MWDL/DPLA announcements and events such as the digitization mini-contracts program and the DPLA Community Representatives program.
Empire State Digital Network (ESDN) Project Update for ASCLA-CDIG, Midwinter 2015
1. EMPIRE STATE
DIGITAL NETWORK
New York Service Hub of the DPLA
Kerri Willette, ESDN Manager
ASCLA Collaborative Digitization Interest Group
January 31, 2015
4. Timeline
• Hire staff
• Convene advisory & working groups
• Develop infrastructure
• Establish workflows
• Gather data from NY 3Rs hosted collections
• Create and distribute guidelines for contribution
• Outreach to LAMs statewide
• Grow New York content in DPLA
Phase 1: October 2013 - April 2015
Phase 2: May 2015 - April 2016
8. Mapping
MODS is mapped to the DPLA MAP (Metadata Application Profile)
http://dp.la/info/developers/map/
Closely follows DPLA’s preferred MODS implementation (NCDHC!)
9. Required/Recommended Fields
Required (per record)
Title
Rights
Link to record on local site
Link to content preview (where applicable)
Thumbnail
Other fields recommended and/or optional
Place
Subject
Date
Type
Language
http://www.mnylc.org/esdn/contributors/metadata-requirements/
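A check of the sort described on this slide is easy to automate. The sketch below is a hypothetical illustration of validating one harvested record against required and recommended fields; the field names and sample record are invented, not ESDN's actual schema (and the conditional "link to content preview" requirement is omitted for simplicity).

```python
# Hypothetical sketch of the kind of check a hub might run against
# harvested records: verify required fields are present and report
# which recommended fields each record is missing. Field names are
# illustrative, not ESDN's actual schema.
REQUIRED = ["title", "rights", "record_link", "thumbnail"]
RECOMMENDED = ["place", "subject", "date", "type", "language"]

def check_record(record):
    """Return (missing_required, missing_recommended) for one record."""
    missing_req = [f for f in REQUIRED if not record.get(f)]
    missing_rec = [f for f in RECOMMENDED if not record.get(f)]
    return missing_req, missing_rec

record = {
    "title": "Erie Canal photographs",
    "rights": "Public domain",
    "record_link": "http://example.org/cdm/ref/collection/p1/id/42",
    "thumbnail": "http://example.org/thumb/42.jpg",
    "subject": "Canals",
}

missing_req, missing_rec = check_record(record)
print(missing_req)  # []
print(missing_rec)  # ['place', 'date', 'type', 'language']
```

Records missing required fields would be flagged back to the partner, while missing recommended fields would just lower the record's discoverability in DPLA.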
10. Next Steps
DPLA is harvesting our first set of almost 90,000 records now.
New York State content should be live in DPLA in late February.
We’ll open up contributions more broadly in April.
NY 3Rs is a collaborative of nine library resource councils statewide. ESDN is administered by the NY 3Rs, but we're hosted and funded primarily by one of those regions, METRO.
There are four regionally supported collaborative digitization projects in the state, each backed by its regional council. Every council has a digital services manager or coordinator on staff, and ESDN works with these council liaisons to coordinate contributions to DPLA statewide.
So, when ESDN was announced in October 2013, a three-year project plan was put in place. The plan was divided into two phases.
As you can see, we are now in the home stretch of Phase 1 which has focused primarily on laying the necessary groundwork to get the hub up and running.
Now we’re preparing to move into phase 2 this spring. Phase 2 is when the hub will really open for business. That’s when we will begin to open up participation more broadly throughout the state.
In Phase 1 we focused primarily on 3Rs-hosted collections in the state, i.e. the low-hanging fruit. We used select collections from regional projects and a few additional partners to get our infrastructure and workflows ironed out.
Phase 1
First contribution to DPLA – happening now! 89,626 records
Schedule monthly ongoing harvests
Coordinate outreach and communication through 3R’s regional liaisons in preparation for Phase 2
So when we open up contribution more broadly in Phase 2 and begin bringing in content from all sorts of organizations statewide, we have some idea now of what our process needs to look like.
Phase 2
Broaden contribution
Streamline contribution processes
Continue adding regionally hosted content
Work with regional liaisons to include non-hosted partners statewide
Grow New York content in DPLA
Of course not every institution in the state has their digital content in one of these hosted projects. So, in addition to growing content from the existing hosted projects, regional liaisons will also facilitate partnerships with institutions that host collections locally.
The ESDN Regional Liaisons are already meeting as a group and working with ESDN staff to coordinate consistent and clear communication statewide.
Partner metadata coming from CONTENTdm, Islandora, CollectiveAccess, ArchivalWare, etc. – 71 institutions
We pull in data from multiple formats and map it to ESDN's MODS model. We apply transformations to address system quirks, fix inconsistencies in date fields, and smooth other provider idiosyncrasies. We output one single stream of normalized data to DPLA.
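The date-field clean-up mentioned above is a typical example of such a transformation. The sketch below is illustrative only: the input formats and the pass-through/flag behavior are assumptions about the kind of normalization a hub might do, not ESDN's actual rules.

```python
# Illustrative sketch of one normalization step in a pipeline like the
# one described: providers send dates in different shapes, and the hub
# emits one consistent form. The formats handled here are assumptions.
import re
from datetime import datetime

def normalize_date(raw):
    """Map a few common provider date formats to ISO 8601 (YYYY-MM-DD)."""
    raw = raw.strip()
    for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%B %d, %Y"):
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            pass
    # A bare year (e.g. "1923") passes through unchanged.
    if re.fullmatch(r"\d{4}", raw):
        return raw
    return None  # unrecognized value, flag for manual review

print(normalize_date("03/15/1923"))      # 1923-03-15
print(normalize_date("March 15, 1923"))  # 1923-03-15
print(normalize_date("1923"))            # 1923
print(normalize_date("ca. 1920s"))       # None
```

In a real pipeline the unrecognized values (circa dates, ranges, free text) are exactly the provider idiosyncrasies that get reported back to the partner for clean-up.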
Mapping our “single stream of data”, MODS, to the DPLA Metadata Model.
Benefited from work of existing service hubs, like North Carolina Digital Heritage Center, who are also providing MODS to the DPLA
DPLA MAP specifies required/recommended fields
Few required fields, more optional and recommended fields
Low-barrier to entry
We work one-on-one with partners, doing field mapping in spreadsheets and using some tools developed by North Carolina to review various aspects of the data, then send feedback to the institution for clean-up or mapping changes.
Check for presence of required and recommended fields
Review data feed for inconsistencies and mapping issues
Help to configure/troubleshoot OAI
Provide feedback to partners
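When troubleshooting a partner's OAI endpoint, the first step is usually just constructing the right request URL. The sketch below builds an OAI-PMH ListRecords request; the base URL, set name, and `mods` prefix are placeholder assumptions, not a real provider's configuration.

```python
# Sketch of constructing an OAI-PMH harvest request, the kind of thing
# a hub checks when helping a partner configure or troubleshoot their
# endpoint. The base URL and set name below are placeholders.
from urllib.parse import urlencode

def list_records_url(base_url, metadata_prefix, set_spec=None, from_date=None):
    """Build an OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    if from_date:
        params["from"] = from_date  # incremental harvest since this date
    return base_url + "?" + urlencode(params)

url = list_records_url("http://example.org/oai", "mods",
                       set_spec="esdn", from_date="2015-01-01")
print(url)
```

The `from` parameter is what makes the "monthly ongoing harvests" mentioned earlier incremental: only records changed since the last harvest date are returned.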
We don’t ever have direct access to our providers’ data. Sometimes we’re two or three steps away from direct contact with providers, so communicating changes can be cumbersome.