Brad Houston presented information on data management plans (DMPs) required by the National Science Foundation (NSF) for grant proposals. He explained that DMPs must describe the data to be collected or generated, how it will be organized and formatted, and how it will be preserved and shared. He emphasized using open standards and preparing metadata to help others understand and find the data. Researchers were advised to consider long-term preservation and to partner with libraries or repositories to ensure access over time. Contact information was provided for those needing assistance developing their DMP.
The document discusses requirements for National Science Foundation (NSF) Data Management Plans (DMPs). Starting in 2011, DMPs describing how research data will be organized, preserved, and shared are required as part of NSF grant proposals. DMPs must address data standards, access and sharing policies, and long-term preservation and access. Resources for writing DMPs are provided, including tools, best practices examples, and experts available for consultation.
Presentation from a University of York Library workshop on research data management. The workshop provides an introduction to research data management, covering best practice for the successful organisation, storage, documentation, archiving, and sharing of research data.
Managing data throughout the research lifecycle - Marieke Guy
This document summarizes a presentation about managing data throughout the research lifecycle. It discusses the stages of the research lifecycle, including planning, data creation, documentation, storage, sharing, and preservation. It provides examples of research lifecycle models and addresses key questions to consider at each stage, such as what formats to use, how to document data, where to store it, and how to share and preserve it. The presentation emphasizes making informed decisions about data management and talking to colleagues for support and advice.
No Free Lunch: Metadata in the life sciences - Chris Dwan
This presentation covers some challenges and makes suggestions to support the work of creating flexible, interoperable data systems for the life sciences.
This slideshow was used in a Preparing Your Research Data for the Future course taught in the Medical Sciences Division, University of Oxford, on 2015-06-08. It provides an overview of some key issues, focusing on long-term data management, sharing, and curation.
IDCC Workshop: Analysing DMPs to inform research data services: lessons from ... - Amanda Whitmire
A workshop as part of the International Digital Curation Conference 2016 on DMP development and support. This presentation demonstrates how we can use data management plans as a source of information to better understand researcher data stewardship practices and how to support them. Be sure to see the slide notes to better understand the presentation (most slides are just photos/icons).
This presentation was delivered at the Elsevier Library Connect Seminar on 6 October 2014 in Johannesburg, 7 October 2014 in Durban, and 9 October 2014 in Cape Town. It gives an overview of the potential role that librarians can play in research data management.
DataONE Education Module 03: Data Management Planning - DataONE
Lesson 3 in a set of 10 created by DataONE on Best Practices for Data Management. The full module can be downloaded from the DataONE.org website at: http://www.dataone.org/educaiton-modules. Released under a CC0 license; attribution and citation requested.
This slideshow was used in an Introduction to Research Data Management course for the Social Sciences Division, University of Oxford, on 2015-05-27. It provides an overview of some key issues, looking at both day-to-day data management, and longer term issues, including sharing, and curation.
Data Management for Research (New Faculty Orientation) - aaroncollie
Situates research data management as a contingency that should be addressed and provisioned for during planning and research design. Draws out fundamental practices for file management and data description, and enumerates storage decision points.
This slideshow was used in a Preparing Your Research Material for the Future course taught in the Humanities Division, University of Oxford, on 2014-06-09. It provides an overview of some key issues, focusing on the long-term management of data and other research material, including sharing and curation.
Data management plans and planning - a gentle introduction - Martin Donnelly
The document provides an overview of facilitating open science training for European research. It discusses data management plans and planning, including the importance of planning, what a data management plan entails, and examples of DMPs. It also describes the Horizon 2020 DMP pilot program in Europe and requirements for DMPs submitted with grant proposals. Finally, it outlines support resources for developing DMPs and the objectives and methods of the FOSTER project which aims to support the adoption of open access policies in European research.
Data collection is the process of systematically gathering information to answer research questions. Accurate data collection is essential to maintaining research integrity. Issues that can compromise integrity include errors in data collection instruments or procedures. Quality assurance and quality control help ensure integrity. Quality assurance occurs before data collection through standardized protocols and manuals. Quality control occurs during and after collection through review and validation of data. Maintaining integrity supports accurate conclusions and prevents wasted resources.
The document provides an introduction to research data management planning, explaining what a data management plan is, what it should include, and tools and resources available for creating a plan. It discusses the key components of a data management plan such as describing the project and data, handling the data during the project, documentation, long-term preservation, and meeting requirements. Finally, it provides examples of planning tools and resources for developing a data management plan.
This document provides an introduction to data management. It discusses why data management is important, covering key aspects like developing data management plans, file organization, documentation and metadata, storage and backup, legal and ethical considerations, sharing and reuse, and preservation. Effective data management is critical for research success as it supports reproducibility, sharing, and preventing data loss. The document outlines best practices and resources like the library that can help with developing strong data management strategies.
Research Data Management: What is it and why is the Library & Archives Servic... - GarethKnight
This document summarizes research data management and the library and archives service's involvement. It defines research data, explains why data needs to be managed, and outlines the key drivers for data management and publication. It then describes the library and archives service's knowledge of data management, the research data management support service being established, and the guidance, training, and tools being developed to help researchers with data management.
February 18, 2015 NISO Virtual Conference: Scientific Data Management: Caring for Your Institution and its Intellectual Wealth
Using data management plans as a research tool: an introduction to the DART Project
Amanda L. Whitmire, Ph.D., Assistant Professor, Data Management Specialist, Oregon State University Libraries & Press
Going Full Circle: Research Data Management @ University of Pretoria - Johann van Wyk
Presentation delivered at the eResearch Africa Conference, held 23-27 November 2014, at the University of Cape Town, Cape Town, South Africa. Various approaches to Research Data Management at Higher Education Institutions focus on an aspect or two of the research data cycle. At the University of Pretoria the approach has been to support researchers throughout the research process, covering the whole research data cycle. The idea is to facilitate/capture the research data throughout the research cycle. This will give context to the data and will add provenance to the data. The University of Pretoria uses the UK Data Archive's research data cycle model to align its Research Data Management project development. This model identifies the stages of a research data cycle as: creating data, processing data, analysing data, preserving data, giving access to data, and reusing data.
This paper will give a short overview of the chronological development of research data management at the University of Pretoria. The overview will also highlight findings of two surveys done at the University, one in 2009 and one in 2013. This will be followed by a discussion of a number of pilot projects at the University, and how the needs of researchers involved in these projects are being addressed in a number of the stages of the research data cycle. The discussion will also give a short overview of how the University plans to support those stages not currently being addressed.
The second part of the presentation will focus on the projects and technology (software and hardware) used. The University of Pretoria has adopted an Enterprise Content Management (ECM) approach to manage its Research Data. ECM is not a singular platform or system but rather a set of strategies, tools and methodologies that interoperate with each other to create a comprehensive management tool. These sets create an all-encompassing process addressing document, web, records and digital asset management.
At the University of Pretoria we address all these processes with different software suites and tools to create a complete management system. Each process presented its own technical challenges, which had to be addressed while keeping in mind the end objective of supporting researchers throughout the whole research process and data life cycle. Various platforms and standards have been adopted to meet the University of Pretoria's criteria. To date, three processes have been addressed: the capturing of data during the research process, the dissemination of data, and the preservation of data.
Elaine Martin, MSLS, DA, Donna Kafel, RN, MSLS, and Andrew Creamer, MaEd, MSLS of UMass Medical School's Lamar Soutter Library present Best Practices for Managing Data. The presentation covers the importance of managing data for research projects and tactical best practices for creating a data management and sharing plan, including how to label, secure, store, and preserve data. Issues such as licensing, data dictionaries, regulations, and metadata are also addressed in the presentation.
RDAP 16 Poster: Interpreting Local Data Policies in Practice - ASIS&T
Research Data Access and Preservation Summit, 2016
Atlanta, GA
May 4-7, 2016
Poster session (Wednesday, May 4)
Presenters:
Line Pouchard, Purdue University
Donna Ferullo, Purdue University
Introduction to research data management - Michael Day
Slides from a presentation given at the JIBS User Group / RLUK joint event "Demystifying research data: don't be scared, be prepared" held at the SOAS Brunei Gallery, London, 17 July 2012.
Have you implemented a Data Management Plan (DMP) tool at your institution or are you currently involved in discussions to implement one? Would you like to connect with others who are involved in implementing DMPs? Then this webinar is for you!
This webinar will bring together those involved in planning or implementing DMPs to exchange information and explore ideas around DMPs.
>>>>>>>>>>>>>>>>>>>>>>>>
Kathryn Unsworth and Natasha Simons lead the conversation by starting off with a few thoughts on:
-- a wrap-up of the DMP Birds of a Feather session at eResearch Australasia (Oct 2016)
-- DMPs v2
-- discussion around DMPs as Thing 15 in the 23 (Research Data) Things program
-- and some thought-provoking ideas.
This section WILL be recorded.
Then we will open up the floor for discussion - NOT recorded.
We will also be looking to gauge interest in the formation of a DMP Community of Practice in Australia.
>>>>>>>>>>>>>>>>>>>>>>>>
Background:
Significant advocacy and technical effort have been directed towards the development and use of DMP tools. However, the agents and motivations driving DMP use differ, presenting use cases to explore and questions to answer:
-- Why implement a DMP tool?
-- Does DMP use align with an agent’s motivations and more importantly with intended outcomes?
-- What are the expected outcomes?
-- Is there a one-size-fits-all DMP?
-- Is best practice for researchers an aim or a hoped-for by-product?
>>>>>>>>>>>>>>>>>>>>>>>>
More info about DMPs: http://www.ands.org.au/working-with-data/data-management/data-management-plans
Australian DMP examples: https://projects.ands.org.au/policy.php
>>>>>>>>>>>>>>>>>>>>>>>>
Contact:
Kathryn.Unsworth@ands.org.au
Natasha.Simons@ands.org.au
This slideshow was used in a Preparing Your Research Material for the Future course for the Humanities Division, University of Oxford, on 2016-11-16. It provides an overview of some key issues, focusing on the long-term management of data and other research material, including sharing and curation.
Introduction to research data management; Lecture 01 for GRAD521 - Amanda Whitmire
Lesson 1: Introduction to research data management. From a series of lectures from a 10-week, 2-credit graduate-level course in research data management (GRAD521, offered at Oregon State University).
The course description is: "Careful examination of all aspects of research data management best practices. Designed to prepare students to exceed funder mandates for performance in data planning, documentation, preservation and sharing in an increasingly complex digital research environment. Open to students of all disciplines."
Major course content includes: Overview of research data management, definitions and best practices; Types, formats and stages of research data; Metadata (data documentation); Data storage, backup and security; Legal and ethical considerations of research data; Data sharing and reuse; Archiving and preservation.
See also, "Whitmire, Amanda (2014): GRAD 521 Research Data Management Lectures. figshare. http://dx.doi.org/10.6084/m9.figshare.1003835. Retrieved 23:25, Jan 07, 2015 (GMT)"
This document provides an overview of developing a data management plan. It discusses the Digital Curation Centre and the speaker's involvement with DMPs. A DMP is a plan for managing research data throughout the data lifecycle that addresses issues like data capture, documentation, access, storage, backup, and long-term preservation. Developing a DMP ensures good data practices and maximizes data reuse. It also benefits research by making the process more efficient, data more accessible and transparent, and findings more impactful. A DMP typically involves researchers, institutions, partners and other stakeholders. Funders like the European Union also have specific DMP requirements for projects seeking funding.
This document summarizes Brad Houston's presentation on building a simple electronic records workflow. It discusses the benefits of electronic records like improved access and context, but also challenges like volume and preservation. It proposes using a "mechanic metaphor" where individuals have enough knowledge to manage electronic records issues. The presentation outlines using free and open source tools to accession, arrange, describe, and preserve a collection of records from a university chancellor's office. It emphasizes the ongoing nature of digital preservation and provides resources for further information.
Presentation on electronic records management and archival issues. Originally presented at the Fall 2008 meeting of the Southeastern Wisconsin Archivists Group
Email Management for Office 365 and Beyond - Brad Houston
This document provides guidance on email management for Microsoft 365 and beyond. It breaks emails down into five categories: non-records, transitory, routine, other records, and historical. Non-records and transitory emails such as spam and calendar invites can be deleted. Routine emails related to ongoing conversations or transactions should be deleted after 6 months. Other records requiring longer retention should be exported according to retention schedules. Historical emails setting policy should be exported to archives. The document recommends using filters and folders to sort emails and adopting a "touch once" method to reduce clutter in inboxes.
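The five categories and the six-month rule for routine email lend themselves to a small triage sketch. The summary above supplies only the category names and the routine retention period; the `RETENTION` table, the `triage` function, and the specific retention values for the other categories are illustrative assumptions, not rules from the slides.

```python
from datetime import datetime, timedelta

# Hypothetical triage rules based on the five categories described above.
# None means "export rather than delete" (per retention schedule or to archives).
RETENTION = {
    "non-record": timedelta(0),       # spam, calendar invites: delete right away
    "transitory": timedelta(0),       # delete once no longer needed
    "routine": timedelta(days=183),   # roughly 6 months, then delete
    "other-record": None,             # export according to retention schedules
    "historical": None,               # export to archives
}

def triage(category: str, received: datetime, now: datetime) -> str:
    """Return the action ('keep', 'delete', or 'export') for one email."""
    limit = RETENTION[category]
    if limit is None:
        return "export"
    if now - received >= limit:
        return "delete"
    return "keep"
```

In practice these decisions would be encoded as mailbox filters and folder retention policies rather than a script, but the function makes the decision table explicit.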
This document discusses personal digital archiving, including what it is, why it's important, who the key players are, when related events may take place, how people can get involved, where to find more information, and it addresses other potential questions. The document provides guidance on best practices for individuals to select, preserve, and organize their digital materials for archival purposes.
The document discusses NSF requirements for data management plans for grant proposals. It notes that as of January 2011, proposals must include a data management plan that addresses how data will be organized, preserved, and shared. The plan must provide enough detail for reviewers to understand how data will be managed during and after the project. Guidelines are provided on the key elements to address in a data management plan, including what data will be collected, how it will be formatted and documented, how others can access and use the data, and how the data will be preserved long-term. Resources for developing effective data management plans are suggested.
Microfilm or Digitize: Which is Right for You? (Brad Houston)
Presentation on reformatting options for active and inactive records. Originally presented at the 2009 Annual Conference of the International Institute of Municipal Clerks, May 20, 2009
The document discusses requirements for data management plans from the National Science Foundation. It notes that as of January 2011, NSF will require a data management plan for all new grant proposals as well as existing grants. The plan must address what data will be collected and how it will be organized, preserved, shared, and accessed. It emphasizes the importance of effective data management for facilitating research by both the principal investigators and other researchers. The document provides guidance on developing a data management plan that meets NSF's criteria and effectively manages research data.
Funder requirements for Data Management Plans (Sherry Lake)
This document discusses funder requirements for data management and sharing. It notes that major funders like the National Science Foundation (NSF) and National Institutes of Health (NIH) require applicants to submit a data management plan. These plans describe how research data will be organized, preserved, and shared. The document provides details on what funders expect to see in a data management plan, including a description of the data, metadata standards, data access and sharing policies, and plans for long-term data preservation. It also lists other funders that require applicants to have a data management or sharing plan.
This document summarizes strategies for creating data management plans and developing sustainable research data management services. It discusses defining research data and data management, federal public access mandates from agencies like NIH and NSF, resources for librarians, workflows for data management plan consultations, and developing scalable research data management services. It provides an overview of common elements to include in data management plans, such as data products, repositories, metadata, documentation, and access, and lessons learned from establishing research data management services at one university.
- The document summarizes a workshop on research data management given by Stephanie Simms from the California Digital Library.
- It discusses an overview of research data management and the "Support Your Data" program, which aims to help researchers better organize, save, document, and share the outputs of their work.
- The workshop covered assessing current data management practices, accessing tools and resources, and data-related services available at Kyoto University.
NSF Data Requirements and Changing Federal Requirements for Research (Margaret Henderson)
This document discusses NSF requirements for data management plans and sharing research data. It provides an overview of what NSF expects to be included in a data management plan, such as the types of data produced, data standards, storage and preservation plans, policies for access and sharing, and archiving data for long-term access. The document also mentions other funder and government policies regarding public access to published research and supporting data. Resources for creating data management plans and sharing data, such as the DMPTool and research data repositories, are also introduced.
Scholars and researchers are being asked by an increasing number of research sponsors and journals to outline how they will manage and share their research data. This is an introduction to data management and sharing practices with some specific information for Columbia University researchers.
NIH Grants and Data: New Rules Coming in 2023 (Erin Owens)
The document discusses new rules from the National Institutes of Health (NIH) requiring funding applicants to include a Data Management and Sharing Plan (DMSP) beginning in January 2023. The new policy aims to improve data sharing and management practices for NIH-funded research. Under the new rules, applicants must describe how they will preserve, access, and share scientific data generated by their research. The DMSP must address six elements: data type, related tools/software, standards, data preservation/access timelines, access considerations, and oversight. The goal is to maximize research outcomes while supporting rigor and reproducibility.
This document discusses data management practices for researchers. It defines what constitutes data, such as observations, experiments, simulations, and documents. It outlines the roles of librarians in advising on data management plans, metadata practices, and archiving data. It also discusses why data management is important for validation, replication of research, and compliance with funder requirements. The document provides examples of file structures, naming conventions, metadata, codebooks, and archiving data in institutional repositories to facilitate long-term access and reuse of research data.
This document provides an introduction to the National Science Foundation's (NSF) data policies and the Indiana University-Purdue University Indianapolis (IUPUI) University Library's data services program. It summarizes NSF's policies on disseminating and sharing research data, including requirements for submitting a data management plan with grant proposals. The document then outlines best practices for addressing different components of a data management plan, such as describing your data, standards, metadata, access and sharing policies, long-term preservation, and roles and responsibilities. Contact information is provided for the Digital Scholarship and Data Management Librarian for questions.
Meeting the NSF DMP Requirement, June 13, 2012 (IUPUI)
The document provides guidance on developing a data management plan (DMP) to meet requirements for National Science Foundation grant proposals. It discusses the context and rationale for federal data policies, defines the key elements required for a DMP, and provides examples of DMPs for different types of research data. The main points are: understanding the NSF data policy aims to increase research impact and data sharing/reuse; a DMP must address the types of data generated, metadata standards, data access/sharing plans, long-term preservation, and associated costs; and good planning helps ensure data remains accessible, usable and preserved into the future. Resources and guidance are available to help researchers develop robust and fundable DMPs.
Introduction to research data management (dri_ireland)
An Introduction to Research Data Management: slides from a presentation given online on May 12 2022, by Beth Knazook, Project Manager, Research Data. Covers topics such as: what are research data; why share research data; why DMPs are important; and where should you share your data?
The format for the data management plans for PhD students at Wageningen UR explained. This format was developed by the library in cooperation with the Wageningen Graduate Schools.
Data Management for Postgraduate students by Lynn Woolfrey (pvhead123)
This document discusses research data management for postgraduates. It explains that research data management refers to storing, accessing, and preserving research data. It notes that funders and universities now require data management plans for funding proposals and research. The document provides reasons for doing research data management, such as ensuring long-term data preservation, preventing fraud, and enabling data reuse. It outlines elements to include in a data management plan and resources for writing plans. The document advises that data services can help take the burden of research data management off researchers.
Meeting Federal Research Requirements for Data Management Plans, Public Acces... (ICPSR)
These slides cover evolving federal research requirements for sharing scientific data. Provided are updates on federal agency responses to the 2013 OSTP memo, guidance on data management plans, resources for data management and curation training for staff/researchers, and tips for evaluating public data-sharing services. ICPSR's public data-sharing service, openICPSR, is also presented. Recording of this presentation is here: https://www.youtube.com/watch?v=2_erMkASSv4&feature=youtu.be
This presentation was provided by Maria Praetzellis of California Digital Library, during the NISO hot topic virtual conference "Effective Data Management," which was held on September 29, 2021.
The document provides guidance on early planning for data management, including becoming familiar with funder requirements, planning for the types and formats of data that will be created, designing a system for taking notes, organizing files through consistent naming schemes and use of folders, adding metadata to files to aid in documentation and discovery, and using RSS feeds to organize web-based information. It also touches on issues like plagiarism, data protection, intellectual property rights, and remote access to and backup of data.
This slideshow was used in a Research Data Management Planning course taught at IT Services, University of Oxford, on 2015-11-04. It provides an overview of the elements of a data management plan, plus an introduction to some tools that can be used to build one.
Feb 26 NISO Training Thursday
Crafting a Scientific Data Management Plan
About the Training
Addressing a data management plan for the first time can be an intimidating exercise. Join NISO for a hands-on workshop that will guide you through the elements of creating a data management plan, including gathering necessary information, identifying needed resources, and navigating potential pitfalls. Participants explore the important components of a data management plan and critique excerpts of sample plans provided by the instructors.
This session is meant to be a guided, step-by-step session that will follow the February 18 NISO Virtual Conference, Scientific Data Management: Caring for Your Institution and its Intellectual Wealth.
About the Instructors
Kiyomi D. Deards, MSLIS, Assistant Professor, University of Nebraska-Lincoln Libraries
Jennifer Thoegersen, Data Curation Librarian, University of Nebraska-Lincoln Libraries
This document discusses challenges and potential solutions for improving data sharing in neuroscience. It notes that while there is a large amount of neuroscience data, it is unevenly distributed across repositories and databases. The document proposes creating a distributed "data sharing ecosystem" where data and related metadata are systematically tracked, linked and made available. Key elements would include unique IDs for all data objects, logging all activities, and developing accountability scores and influence measures to promote better data citizenship. However, concerns are raised about monitoring researchers and potential biases, which would need to be addressed for such a system to work.
Reading the Library General Records Schedule (Brad Houston)
This document provides guidance on records management for university libraries. It summarizes the University of Wisconsin System Library General Records Schedule (GRS), which establishes minimum retention periods for various types of library records. The GRS covers 44 record series organized into 9 categories. The document explains how to identify the appropriate record series, retention periods, and disposition of records. It also provides guidance on managing electronic records and records requiring confidential destruction.
Brad Houston provides a presentation on records management laws and policies for UWM employees. He discusses relevant laws like FERPA, HIPAA, and copyright law. The presentation emphasizes the importance of properly handling records requests, litigation holds, and electronic records. Employees are advised to contact legal affairs if they have questions about complying with records laws and policies.
The document discusses challenges with managing electronic records, including massive volume, unnecessary copies, and lack of control over organization. It provides guidance on identifying records, ensuring authenticity through versioning or digital signatures, and creating a consistent filing system using descriptive names and tags. The document stresses the importance of complying with records retention and disposition authorities to systematically preserve and destroy records according to legal requirements.
Finding and Reading General Records Schedules (Brad Houston)
This document provides an overview of records management at the University of Wisconsin-Milwaukee. It discusses what constitutes a record, records retention schedules, the roles of official and unofficial copies, how schedules are developed, and resources for records management guidance. Key contacts and retention periods are listed for common record types like committee minutes, personnel files, and fiscal records.
This document provides an overview of records management basics, including definitions of key terms, the importance of proper records management, the records lifecycle, and how the University of Wisconsin-Milwaukee Records Management program can assist offices in managing both physical and electronic records. It explains that records management ensures efficiency, compliance with legal requirements, and preservation of institutional history. The document outlines the records lifecycle of creation, use, maintenance, and final disposition or archival retention. It also addresses electronic records, records scheduling, transfers to archives, and records retrieval services.
The document provides guidance on identifying, organizing, preserving, and managing email records for state employees. It notes that 90% of new records are created electronically and email comprises most new electronic records. It emphasizes the need to identify email that are records versus non-records, and to utilize tools to organize, retain, and dispose of email records appropriately according to retention schedules.
The document discusses best practices for managing electronic records (e-records) in university offices. It recommends treating e-records the same as paper records by (1) identifying which files are official records using criteria like supporting transactions or documenting policies, (2) organizing records in a consistent filing system like folders on your computer, and (3) following records retention and disposition authorities (RDDA) to determine how long to keep records and when they can be destroyed. The document also provides tips for long-term preservation of e-records, such as converting files to neutral formats and storing them remotely.
The document provides an overview of records management basics and the records life cycle. It discusses why records management is important both legally and administratively. Records must be properly managed and retained or destroyed according to approved records retention schedules. Electronic records and email pose special challenges and must be managed according to state requirements. The University Records Management program can assist with records scheduling, transfers, destruction and reference requests.
2. Document describing data (and/or digital materials) that have been or will be gathered in a study or project.
Often includes details on how data will be organized, preserved, and accessed
Facilitates re-use of data sets by either PI or other researchers
Required component of grants for MANY agencies (NSF and NIH)
3. Starting January 2011 for NEW, non-collaborative proposals
Not voluntary – "integral part" of proposal
Data Management Plans for all data resulting from any level of NSF funding
Supplementary 2-page document (max)
Optional: Also part of 15-page (max) Project Description
4. Must address both physical and digital data
"Efficiency and effectiveness" of the DMP will be considered by NSF and disciplinary division or directorate
Must include sufficient information that peer reviewers and project monitors can assess present proposal and past performance
As of January 2011, proposals will not be accepted without an accompanying data plan!
5. "Such dissemination of data is necessary for the community to stimulate new advances as quickly as possible and to allow prompt evaluation of the results by the scientific community." – NSF (italics mine)
Part of Openness trend in federal government (data.gov – Open Government Initiative)
NIH Public Access Policy (2008)
Public access to federally funded research hearings – Information Policy, Census and National Archives Subcommittee of U.S. Congress (July 2010)
6. It makes your research easier!
Data available in case you need it later
Helps avoid accusations of fraud or bad science
To share it for others to use and learn from
To get credit for producing it
To keep from drowning in irrelevant stuff... especially at grant/project end
7. Gene expression microarray data: "Publicly available data was significantly (p=0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin."
Piwowar, Heather, et al. "Sharing detailed research data is associated with increased citation rate." PLoS ONE 2007. DOI: 10.1371/journal.pone.0000308
Maybe there's an advantage here!
8. Discuss specific requirements for NSF Data Management Plans
Suggest ways to manage, share, and archive data more effectively
Provide resources for more information
10. What data are you collecting or making?
Can it be recreated? How much would that cost?
How much of it? How fast is it growing? Does it change?
What file format(s)?
What's your infrastructure for data collection and storage like?
How do you find it, or find what you're looking for in it?
How easy is it to get new people up to speed? Or share data with others?
11. Who are the audiences for your data?
You (including Future You), your lab colleagues (including future ones), your PIs
Disciplinary colleagues, at your institution or at others
Colleagues in allied disciplines
The world!
What are your obligations to others?
Funder requirements
Confidentiality issues
IP questions
Security
12. How do you and your lab get from where you are to where you need to be?
Document, document, document all decisions and all processes!
Secret sauce: the more you strategize upfront, the less angst and panic later.
"Make it up as you go along" is very bad practice!
But the best-laid plans go agley... so be flexible.
And watch your field! Best practices are still in flux.
13. Four kinds of data defined by OMB:
Observational
Examples: Sensor data, telemetry, survey data, sample data, neuroimages.
Experimental
Examples: gene sequences, chromatograms, toroid magnetic field data.
Simulation
Examples: climate models, economic models.
Derived or compiled
Examples: text and data mining, compiled database, 3D models, data gathered from public documents.
14. Preliminary analyses
Raw data is included in this definition
Drafts of scientific papers
Plans for future research
Peer reviews or communications with colleagues
Physical objects, such as gel samples
15. As early as possible, but no later than guidelines laid down by relevant Directorate
Engineering Section: "no later than the acceptance for publication of the main findings of the final data"
Earth Sciences: "No later than two (2) years after the data were collected."
Social and Economic Sciences: "within one year after the expiration of an award"
Be aware of concerns that may require earlier or later disclosure
FERPA? Human Subjects data? HIPAA?
16. Again, specific retention periods will depend on the type of data and the grant program
Example: NSF Engineering Section suggests retention period of "three years after either completion of the grant project or public release of research data, whichever is later"
Certain types of data will need to be retained longer
Patent data, longitudinal data sets, etc.
Ask: is your data of permanent value?
17. Analyzed data (incl. images, tables, and tables of numbers used for making graphs)
Metadata that defines how data was generated, such as experiment descriptions, computer code, and computer-calculation input
18. Investigators are expected to preserve/share primary data, samples, physical collections, & supporting materials
Provide easily accessible information about data holdings, including quality assessments and guidance/finding aids
Data may be made available through submission to national data center, publication in journal, book, or accessible website of institutional archives
20. All submitted plans must include, at minimum:
1. Expected Data: types, physical/electronic collections, materials to be produced
2. Standards for data and metadata format and content
3. Policies for access and sharing, including provisions for appropriate protection of privacy, confidentiality, security, intellectual property, etc.
4. Policies and provisions for re-use, re-distribution, and the production of derivatives
5. Plans for archiving data, samples, and other research products, and for preservation of access to them
21. In short: What kind of data will be produced by your research processes?
Keep in mind:
File formats of complete data sets
Any software or code that will be needed/produced
Physical samples or other individual data points
Some divisions require retention of physical samples; consult your Program Officer
22. In short: how will you organize your data within datasets to make it widely accessible, and how will you make data sets identifiable?
Keep in mind:
Any data formatting standards for your particular discipline
Any metadata (author, date, subject, etc.) that your program attaches automatically, and what you will need to attach manually
How will you find your data for later consultation? How will others find it?
23. In short: How will you allow other researchers to find and use your data?
Keep in mind:
How will other researchers find your data? (i.e., how will you publicize its existence?)
How will you provide access to your data? (CD-RW? Data Repository? Download via pantherFILE?)
How will you prepare your data for sharing? Do you need to depersonalize or declassify anything?
24. Data Management Plans are required even if a project is not expected to generate data that requires sharing
DMP should clearly explain non-sharing in light of COI standards (peer review)
Between the lines: not sharing will require justification and close scrutiny by NSF
Sharing is preferred
25. In short: How will researchers obtain the appropriate permissions to use your data?
Keep in mind:
Is a blanket permissions statement or a case-by-case policy more efficient/practical?
What responsibilities will users of your data have re: privacy, intellectual property, etc.?
How will you deal with users who violate these provisions?
26. In short: How will you make sure your data stays intact and available once you are done using it?
Keep in mind:
What are your retention requirements? Is this a permanent data set?
What storage media will you use? Are you prepared to migrate/emulate as needed?
Do you have a data backup plan?
28. Think about where you will put your data
Local? Network drive? Online data management system?
Think about how you (or others) will find your data
Think about how others may use your data, when found
Think about how to store your data in the long term (or if to store it long-term at all)
29. Will anybody be able to read these files at the end of your time horizon?
Where possible, prefer file formats that are:
Open, standardized
Documented
In wide use
Easy to data-mine, transform, recast
If you need to transform data for durability, do it now, not later.
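For tabular data, a plain-text export is usually the durable choice. A minimal sketch, assuming the results live in a list of dictionaries (the columns are invented for illustration):

```python
import csv

rows = [
    {"sample_id": "A01", "temp_c": 21.4, "ph": 7.1},
    {"sample_id": "A02", "temp_c": 22.0, "ph": 6.9},
]

# Open, documented, widely readable: CSV with an explicit header row.
with open("results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["sample_id", "temp_c", "ph"])
    writer.writeheader()
    writer.writerows(rows)

# By contrast, pickle.dump(rows, ...) ties the data to one language
# (and sometimes one library version) -- avoid it for archival copies.
```

Any spreadsheet, statistics package, or text editor from the foreseeable future can open the CSV; the pickled version needs a compatible Python.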
30. Fundamental question: What would someone unfamiliar with your data need in order to find, evaluate, understand, and reuse them?
Consider the differences between someone inside your lab, someone outside your lab but in your field, and someone outside your field.
Two parts: metadata and methods
31. About the project
Title, people, key dates, funders and grants
About the data
Title, key dates, creator(s), subjects, rights, included files, format(s), versions, checksums
Interpretive aids: codebooks, data dictionaries, algorithms, code
Keep this with the data; think of it as a Readme file
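Part of the readme can be generated rather than hand-typed. A sketch that combines hand-written project fields with computed file facts (names, sizes, SHA-256 checksums); the field names here are illustrative, not a formal metadata standard:

```python
import hashlib
import json
import os
from datetime import date

def sha256_of(path: str) -> str:
    """Checksum a file in chunks so large data files don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_readme(data_files, **project_fields) -> dict:
    """Combine hand-written project metadata with computed file facts."""
    return {
        **project_fields,
        "generated": date.today().isoformat(),
        "files": [
            {"name": os.path.basename(p),
             "bytes": os.path.getsize(p),
             "sha256": sha256_of(p)}
            for p in data_files
        ],
    }

# Example: describe one data file and save the record next to it.
with open("trial01.csv", "w") as f:
    f.write("sample_id,temp_c\nA01,21.4\n")

record = build_readme(["trial01.csv"],
                      title="Example study", creator="J. Researcher")
with open("README.json", "w") as f:
    json.dump(record, f, indent=2)
```

The checksums double as the integrity check mentioned on the slide: anyone who downloads the data later can recompute them and confirm nothing was corrupted in transit or storage.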
32. Reason #1 for not reusing someone else's data: "I don't know enough about how it was gathered to trust it."
Document what you did. (A published article may or may not be enough.)
Document any limitations of what you did.
If you ran code on the data, document the code and keep it with the data.
Need a codebook? Or a data dictionary? If I can't identify at sight what each bit of your dataset means, yes, you do need a codebook or data dictionary.
DO NOT FORGET UNITS!
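A codebook needs no special software: a small table mapping each variable to a definition, an explicit unit, and its allowed values goes a long way. A sketch with invented variables:

```python
import csv

# One row per variable in the data set: name, meaning, unit, allowed values.
codebook = [
    {"variable": "sample_id", "description": "Unique specimen label",
     "unit": "n/a", "values": "A01-A99"},
    {"variable": "temp_c", "description": "Water temperature at collection",
     "unit": "degrees Celsius", "values": "0.0-40.0"},
    {"variable": "ph", "description": "Acidity of sample",
     "unit": "pH (unitless)", "values": "0.0-14.0"},
]

with open("codebook.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["variable", "description", "unit", "values"])
    writer.writeheader()
    writer.writerows(codebook)
```

Keeping the codebook in the same directory as the data (and listing it in the readme) means a stranger can interpret every column, units included, without emailing you.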
33. Your own drive (PC, server, flash drive, etc.)
And if you lose it? Or it breaks?
Somebody else's drive
Departmental or campus drive
"Cloud" drive
Do they care as much about your data as you do? What about versioning?
Library motto: Lots Of Copies Keeps Stuff Safe. Two onsite copies, one offsite copy.
Keep confidentiality and security requirements in mind, of course
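"Lots of copies" only helps if the copies still match the master. A minimal sketch that flags any backup copy whose SHA-256 checksum has drifted; the file paths are placeholders standing in for onsite and offsite drives:

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_copies(master: Path, copies: list[Path]) -> list[Path]:
    """Return the copies that are missing or no longer match the master."""
    expected = checksum(master)
    return [c for c in copies if not c.exists() or checksum(c) != expected]

# Example: one master file plus two copies standing in for backup drives.
master = Path("data.csv")
master.write_text("sample_id,ph\nA01,7.1\n")
copies = [Path("onsite_copy.csv"), Path("offsite_copy.csv")]
for c in copies:
    shutil.copyfile(master, c)

print(verify_copies(master, copies))  # [] -> all copies intact
```

Run periodically (e.g., from a scheduled job), a check like this catches silent corruption or a forgotten sync long before the data is needed again.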
34. If data need to persist beyond project end, you have to deal with a new kind of risk: organizational risk.
Servers come and go. So do labs. So do entire departments.
This is especially important if you share data! Don't let it 404!
You need to find a trustworthy partner.
On campus: try the library or Research and Sponsored Programs. (UITS has a role but can't do it alone!)
Off campus: look for a disciplinary data repository, or a journal that accepts data. (It's a good idea to do this as part of your planning process.)
Let somebody else worry! You have new projects to get on with.
36. Informational websites
UW-Madison: http://researchdata.wisc.edu/
UW-Milwaukee: http://dataplan.uwm.edu
Don’t just use the site for your own campus!
Data experts
IT cyberinfrastructure experts
Archivists/records managers
MINDS@UW (minds.wisconsin.edu): data in final form that make sense as discrete files
37. For Information:
NSF Grant Proposal Guide
http://www.nsf.gov/pubs/policydocs/pappguide/nsf11001/gpg_index.jsp
MIT Data Management and Publishing
http://libraries.mit.edu/guides/subjects/data-management/index.html
For storage/management (non-inclusive):
A partial list of potential repositories: http://databib.org
Ask: can my home institution provide better service?
38. For assistance with writing your plan:
California Digital Library DMP Creation Tool
https://dmp.cdlib.org/ (select "UWM" as institution)
Data Conservancy DMP Template/Questionnaire
http://dataconservancy.org/dataManagementPlans
DataONE Best Practices Examples
http://www.dataone.org/plans
Data Curation Profiles (Purdue University)
http://datacurationprofiles.org/
Digital Curation Centre Tools Catalog
http://www.dcc.ac.uk/resources/external/tools-services
39. Make sure your data plan covers at least the minimum requirements set out by NSF
Create appropriate metadata to help you manage and find data
Use open, universal standards and file formats
Be prepared to preserve access tools along with data itself
Be aware of time periods for data sharing and retention
40. Contact the presenter
Brad Houston, UW-Milwaukee
houstobn@uwm.edu (414) 229-6979
This presentation available online at:
http://www.slideshare.net/herodotusjr/data-management-plans-dmp-for-nsf