This document provides information about a course on research data management. The course covers topics like data management planning, writing data management plans, data citation, data archiving, and legal issues related to data management. It includes an agenda with presentations, activities, and discussions on these topics led by data librarians and legal advisors. The goal is to teach researchers the importance of proper research data management practices.
This document provides guidance on managing research data. It discusses planning ahead by considering data needs, formats, volume and ethics. It also covers organizing data through file naming, metadata, references, remote access and safekeeping. Preserving data involves determining what to keep/delete and using long-term storage such as repositories. Reasons for sharing data include scientific integrity, funding mandates and increasing impact, while reasons for not sharing include financial or sensitive personal information.
This document provides an overview of a workshop on good practice in research data management held at the University of Tartu, Estonia. The workshop covered various topics including defining research data, research data management and data management plans, organizing and documenting data, file formats and storage, metadata, security, and sharing and preserving data. The workshop was led by Stuart Macdonald from the University of Edinburgh and included presentations, introductions, and discussions around each of these research data management topics.
This presentation was delivered at the Elsevier Library Connect Seminar on 6 October 2014 in Johannesburg, 7 October 2014 in Durban, and 9 October 2014 in Cape Town, and gives an overview of the potential role that librarians can play in research data management.
University of Bath Research Data Management training for researchers, by Jez Cope
Slides from a workshop on Research Data Management for research staff and students at the University of Bath.
Part of the Research360 project (http://blogs.bath.ac.uk/research360).
Authors: Cathy Pink and Jez Cope, University of Bath
A basic course on Research data management, part 1: what and why, by Leon Osinski
A basic course on research data management for PhD students. The course consists of 4 parts. The course was given at Eindhoven University of Technology (TUe), 24-01-2017
A basic course on Research data management, part 4: caring for your data, or ..., by Leon Osinski
A basic course on research data management for PhD students. The course consists of 4 parts. The course was given at Eindhoven University of Technology (TUe), 24-01-2017
This presentation discusses managing research data through the data life cycle. It begins with an overview of the research life cycle and embedding the data life cycle within it. Key aspects of data management are then covered, including why manage data, ethical and legal issues, requirements for data sharing and retention, and creating a data management plan. The rest of the presentation delves into each stage of the data life cycle, providing best practices for data collection, organization, security, storage, documentation, processing, analysis, and long-term preservation or sharing. File formats, metadata, repositories, and bibliographic resources are also addressed.
Good (enough) research data management practices, by Leon Osinski
Slides of a lecture on research data management (RDM), given for 3rd year students (Eindhoven University of Technology, major Psychology & Technology), as part of the course 0HV90 Quantitative Research. At the end of the slides a handy summary 'Research data management basics in a nutshell' is added.
Planning for Research Data Management: 26th January 2016, by IzzyChad
This document provides an overview of a session on planning for research data management. It discusses what research data management is, why it is important, and walks through the steps for creating a data management plan. The presenter explains the benefits of effective data management, such as helping researchers work more efficiently and enabling data sharing. Key aspects of a data management plan are also outlined, including describing the data, addressing ethics and intellectual property, determining how data will be stored and preserved, and making plans for data sharing and access.
The document provides an overview of the Donders Repository, which aims to securely store original research data, document the research process, and make data accessible to researchers and the public. It describes the procedural design including different roles, collection types, and states. The technical architecture is based on IRODS software and scalable storage. The repository fits into researchers' workflows and supports the timeline of projects from initiation to data sharing. Standards like BIDS help make neuroimaging data FAIR (Findable, Accessible, Interoperable, Reusable).
This slideshow was used in a Preparing Your Research Material for the Future course taught in the Humanities Division, University of Oxford, on 2014-06-09. It provides an overview of some key issues, focusing on the long-term management of data and other research material, including sharing and curation.
Brad Houston presented information on data management plans (DMPs) required by the National Science Foundation (NSF) for grant proposals. He explained that DMPs must describe the data to be collected or generated, how it will be organized and formatted, and how it will be preserved and shared. He emphasized using open standards and preparing metadata to help others understand and find the data. Researchers were advised to consider long-term preservation and to partner with libraries or repositories to ensure access over time. Contact information was provided for those needing assistance developing their DMP.
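The advice above about preparing metadata so others can understand and find the data can be illustrated with a minimal dataset-description record. This is a sketch only: the Dublin Core-style field choices, values, and placeholder DOI are illustrative assumptions, not an NSF-mandated schema.

```python
import json

# An illustrative dataset-level metadata record using Dublin Core-style
# fields; the field selection is an assumption, not a prescribed standard.
record = {
    "title": "Example sensor readings, 2011 field season",
    "creator": "Example Lab, University of Somewhere",
    "description": "Hourly temperature readings from 12 stations.",
    "format": "text/csv",
    "license": "CC-BY-4.0",
    "identifier": "doi:10.0000/example",  # placeholder identifier, not a real DOI
}

# Serializing to JSON gives a machine-readable record a repository can index
print(json.dumps(record, indent=2))
```

In practice a repository or discipline (e.g. DataCite, Dublin Core) dictates the exact fields; the point is that a small structured record is enough for others to find and interpret the deposit.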
Rebecca Raworth presented a workshop on research data management. The presentation covered:
- Why research data management plans are important, such as satisfying funder requirements and increasing research efficiency.
- Current requirements for data management plans in Canada.
- Tools for research data management, including Portage for creating data management plans and Dataverse for data storage and access.
- Best practices for organizing, documenting, storing and sharing research data, including using metadata standards, file naming conventions, and choosing appropriate data repositories.
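The file-naming conventions mentioned in the best practices above can be sketched as a small helper. The scheme shown (project_measure_date_version) is a hypothetical example, not one prescribed in the workshop:

```python
from datetime import date

def make_filename(project: str, measure: str, run_date: date,
                  version: int, ext: str = "csv") -> str:
    """Build a descriptive, sortable file name:
    project_measure_YYYYMMDD_vNN.ext (an illustrative convention)."""
    # ISO-style dates sort chronologically; zero-padded versions sort numerically
    return f"{project}_{measure}_{run_date:%Y%m%d}_v{version:02d}.{ext}"

print(make_filename("survey", "reaction-times", date(2016, 1, 26), 3))
# survey_reaction-times_20160126_v03.csv
```

Encoding the date and version directly in the name keeps files self-describing and correctly ordered in any directory listing, which is the core of most naming-convention advice.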
Data Management in the context of Open Science.
Because open access has become mandatory for publications and for project-funded research data, each researcher has a responsibility to stay informed about, and be trained in, these new practices.
The webinar discussed FAIRDOM services that can help applicants to the ERACoBioTech call with their data management plans and requirements. FAIRDOM offers webinars on developing data management plans, and their platform and tools can help with organizing, storing, sharing, and publishing research data and models in a FAIR manner by utilizing metadata standards. Different levels of support are available, from general community resources through their hub, to premium customized support for individual projects. Consortia can include FAIRDOM as a subcontractor within the guidelines of the ERACoBioTech call.
The document discusses recommendations from a workshop on peer review of research data. It focuses on three key areas:
1. Connecting data review with data management planning by requiring data sharing plans, ensuring adequate funding for data management, and refusing publication without clear data access.
2. Connecting scientific and technical review with data curation by linking articles and data with versioning, avoiding duplicate review efforts, and addressing issues found in data.
3. Connecting data review with article review by requiring methods/software information, providing review checklists, ensuring data access for reviewers, and permanent dataset identifiers from repositories.
Here are the results of the dotmocracy voting:
- "Libraries are the best departments at universities to take on research data archiving." Received the most dots.
- "High cost research facilities should be obliged to share (and preserve) their data." Received the second most dots.
- "Each dataset should also include the data in its rawest form." Received the third most dots.
The top three propositions that received the most votes were:
1. Libraries are the best departments at universities to take on research data archiving.
2. High cost research facilities should be obliged to share (and preserve) their data.
3. Each dataset should also include the data in its rawest form.
This document provides guidance on managing research data. It discusses planning ahead to consider data needs, formats, and volume. It emphasizes organizing data through file naming, metadata, references, email, and remote access. It stresses preserving data by determining what to keep/delete, using long-term storage such as repositories or archives. Finally, it examines reasons to share data such as scientific integrity, funding mandates, and increasing impact and collaboration.
Using Open Science to advance science - advancing open data, by Robert Oostenveld
This document discusses using open science practices like open data to advance science. It notes the benefits of open data like improved reproducibility and opportunities for data mining. However, sharing neuroimaging and other human subject data presents challenges regarding data size, sensitivity, and privacy regulations. The document promotes using the Brain Imaging Data Structure (BIDS) format to organize data in an open, standardized way. It also discusses the gradient between personal/identifiable data that requires protection and de-identified research data that can be shared, as well as legal constraints and appropriate repositories for sharing data responsibly.
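As an illustration of the kind of standardized layout BIDS prescribes, here is a minimal sketch that composes a BIDS-style path for an anatomical image. The entity names (sub-, ses-, anat, T1w) follow the published BIDS convention; the root directory and subject/session values are made up:

```python
from pathlib import Path

def bids_anat_path(root: str, sub: str, ses: str, suffix: str = "T1w") -> Path:
    """Compose a BIDS-style anatomical-image path, e.g.
    root/sub-01/ses-pre/anat/sub-01_ses-pre_T1w.nii.gz"""
    # BIDS encodes entities as key-value pairs joined by underscores
    name = f"sub-{sub}_ses-{ses}_{suffix}.nii.gz"
    return Path(root) / f"sub-{sub}" / f"ses-{ses}" / "anat" / name

print(bids_anat_path("/data/study", "01", "pre"))
```

Because every lab following BIDS produces the same directory and file structure, tools and collaborators can locate and interpret the data without bespoke documentation, which is what makes the shared data findable and reusable.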
This document provides an introduction to the National Science Foundation's (NSF) data policies and the Indiana University-Purdue University Indianapolis (IUPUI) University Library's data services program. It summarizes NSF's policies on disseminating and sharing research data, including requirements for submitting a data management plan with grant proposals. The document then outlines best practices for addressing different components of a data management plan, such as describing your data, standards, metadata, access and sharing policies, long-term preservation, and roles and responsibilities. Contact information is provided for the Digital Scholarship and Data Management Librarian for questions.
The document discusses requirements for data management plans from the National Science Foundation. It notes that as of January 2011, NSF will require a data management plan for all new grant proposals as well as existing grants. The plan must address what data will be collected and how it will be organized, preserved, shared, and accessed. It emphasizes the importance of effective data management for facilitating research by both the principal investigators and other researchers. The document provides guidance on developing a data management plan that meets NSF's criteria and effectively manages research data.
This document discusses implementing Linked Data in low-resource conditions. It begins by outlining goals of providing a high-level view of Linked Data, identifying possible bottlenecks due to limited resources, and offering suggestions to overcome bottlenecks based on experience. It then defines what is meant by "low-resource conditions", including limited IT competencies, software, hardware, electricity, and internet access. The document outlines the Linked Data workflow and discusses each step in more detail, including data generation, conversion to RDF, data storage, maintenance, linking, and exposure. It highlights the example of AGRIS, a collaborative Linked Data application, and emphasizes starting small, being strategic, reusing existing resources, and collaborating to maximize resources in low-resource conditions.
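The "conversion to RDF" step in the workflow above amounts to mapping each field of a flat record to a (subject, predicate, object) triple. A minimal, dependency-free sketch, where the record URI and field values are invented and Dublin Core is used as an example vocabulary:

```python
# Sketch of converting a flat record to RDF triples; in a real pipeline a
# library such as rdflib would manage namespaces and serialization.
def to_triples(record_uri: str, fields: dict) -> list:
    """Map a flat record to (subject, predicate, object) triples,
    using Dublin Core element predicates as an example vocabulary."""
    DC = "http://purl.org/dc/elements/1.1/"
    return [(record_uri, DC + key, value) for key, value in fields.items()]

triples = to_triples(
    "http://example.org/record/123",  # hypothetical record URI
    {"title": "Soil moisture survey 2014", "creator": "Example Research Group"},
)
for s, p, o in triples:
    print(f'<{s}> <{p}> "{o}" .')  # N-Triples-style output
```

Even this trivial mapping shows why the step is cheap to start with: the hard choices are picking stable URIs and a shared vocabulary, not the mechanics of the conversion.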
FAIRDOM data management support for ERACoBioTech Proposals, by FAIRDOM
This document provides information about a webinar from the FAIRDOM Consortium on data management for ERACoBioTech full proposals. It includes:
- Details on how to budget for and include a data management plan in proposals
- A checklist for developing a data management plan covering topics like the types and volumes of data, data sharing and reuse, and making data FAIR
- An overview of the FAIRDOM services and software platform that can help with project data management and stewardship
Mark Baker, president and CEO of the National Aircraft Association (NAA), announced that Sean Gaffney will be stepping down from his role as executive vice president of regulatory and technical affairs at the end of June. Gaffney had worked at NAA for 13 years in roles of increasing responsibility and helped lead several regulatory and legislative initiatives. His replacement has yet to be named by the NAA.
This document summarizes several topics related to operating systems and software, including the advantages of Linux over Windows, the history and features of Android, and general descriptions of Microsoft Office, hardware, programming languages, and Mac OS. It also includes a credits section at the end.
Mr. Belloit has 25 years of experience in estimating for water and wastewater construction projects. He is skilled in developing estimates, identifying costs, performing bid evaluations, and negotiating contracts. Some of his recent project experience includes the $20 million Jacksonville Beach Wastewater Treatment Plant and the $47 million James Anderson Reverse Osmosis Water Treatment Plant in Port St. Lucie, Florida. He is proficient in WinEst and other estimating software programs.
Student ID (NPM): 201243501163
Name: Hamim Suyuti
Class: R7H
Course: Computer Graphics
Lecturer: Nahot Frastian, M.Kom
Study Program: Informatics Engineering
University: Universitas Indraprasta PGRI
On Facing the age-old issue of Illicit Financial Outflows-Vulture Funds, Squa..., by AYshare
Vulture funds refer to illicit financial flows, or money illegally siphoned out of developing countries. An estimated $443 billion was stolen from Sub-Saharan Africa between 1970-2004, while $50-60 billion continues to be lost every year. This capital flight has severely undermined Africa's ability to finance development and reduced revenues available. While the Addis Ababa Action Agenda aims to curb these flows, illicit financial flows remain a major challenge, with corrupt officials and multinational companies complicit in driving money into tax havens and safe havens. Addressing these issues will be crucial to achieving sustainable development goals and ensuring developing countries can finance growth.
"How to teach Ruby / how to learn Ruby", Victor Shepelev (Team Lead at BrandSp..., by Alina Vilk
Abstract / questions covered:
How do you become a Rubyist from scratch? Self-education, books, teachers
How do you set a development direction for a junior colleague or subordinate?
How do you determine the necessary scope of knowledge? How do you recognize your own level?
When and why do we stop learning? Why raise your level?
How to Find a Perfect Employee For Your Organization, by Apptunix
The most important thing to know about the person you are hiring:
look into the skills and the subject areas in which the person holds expertise.
http://apptunix.com
This internship opportunity at WFMJ-TV, a local television station in Youngstown, Ohio, would involve marketing and promoting the station's brands. Responsibilities would include event planning and coordination, developing promotional materials, assisting clients with advertisements, and conducting marketing research. The intern would apply and develop skills in areas like marketing, graphic design, public relations, and communications. Several projects and events are mentioned, such as community outreach activities and press conferences. The document emphasizes that the internship would be an educational experience allowing one to expand their skills in preparation for a future career.
IEW is an industrial engineering consulting firm established in 2015 to provide process analysis, continuous improvement, and training services. The company focuses on process modeling, documentation, and measurements to improve organizational efficiency. IEW's team of industrial engineers use Lean Six Sigma tools and methodologies to design and implement new processes. Their goal is to partner with clients to achieve goals through customized industrial engineering solutions.
This dissertation examines how key financial value drivers can impact shareholder value and the proper management of these drivers to maximize value. It contrasts a shareholder value maximization approach with a profit maximization view. Various value creation metrics are analyzed, including discounted cash flow valuation and economic profit. Strategies for different companies are evaluated based on how they affect value drivers and shareholder value. Historical performance is compared across companies and potential value-destroying factors are considered. The goal is to determine how value drivers can be best managed to create long-term shareholder value.
FORTUNE SQUARE is strategically located at Rathinamangalam, near Tambaram, one of the most promising and rapidly appreciating residential destinations of Chennai. It benefits from a distinctively prominent setting that makes it a smart and valuable investment with guaranteed high returns, in close proximity to well-known educational and commercial establishments, popular leisure hubs, and public transportation.
This document describes a metallographic test carried out in the laboratory to determine the microscopic structure of an F-1150 steel specimen containing 0.55% carbon. It explains the sanding, polishing, and chemical etching processes used, as well as the calculations showing that the specimen contains 61.79% pearlite and 38.20% ferrite. Finally, it includes photographs of the results taken under the microscope.
Siemcom is an IT systems integrator based in Abu Dhabi with a presence in Dubai, KSA, and Qatar. It was formed over a decade ago by professionals experienced in ICT solutions. Siemcom provides a range of ICT solutions and services including voice, data, infrastructure, and professional services to simplify customers' business communications at affordable prices. It partners with leading technology companies and has a strong customer support system.
Target Audience: Public
This project basically focuses on financing for development to achieve the sustainable development goals which expand on the millennium development goals.
The influences of globalization on criminal policy choices. The evol..., by Federico Cappelletti
GIORNATE DI STUDIO TRIVENETE, organized by the Consiglio Nazionale Forense with the Unione Triveneta degli Ordini degli Avvocati and the Unione delle Camere Penali Italiane - Treviso, 4 November 2016
Criminal law adapting to new demands for criminalization, and the lawyer's role in that adaptation process, including through court-appointed defense and state-funded legal aid.
El documento describe varios aparatos inventados entre 1824 y 1895 para crear la ilusión de imágenes en movimiento antes del cine, como el mutoscopio, la linterna Kinora, el fenaquistiscopio, la linterna mágica, el praxinoscopio, el kinematoscopio y el traumatropo. Cada uno se basaba en principios ópticos como la persistencia retiniana para mostrar rápidamente una serie de imágenes fijas y crear la apariencia de movimiento.
2. Blackboard
UT website, employees page
ORG-AA-BA-RESDATAMAN: Course Research Data Management
Course material: presentations, links to information, DMP template, datasets
After the course day: contact for support and feedback
3. Why research data management
• Importance of quality, reliability, replicability and verification of scientific research
• Better and more efficient access to research data
• Requirements of research funders with regard to data management
• Data management will become an issue in research assessments
4. Benefits of research data management
• Improved research quality
• Improved efficiency
• Protection from data-related risks
• Enhanced reputation and prestige
5. Research Data Management: importance (1/2)
Scientific integrity (1), funder requirements (2) and developments in science (3)
(1) Fabrication, Falsification and Plagiarism (FFP) > RDM?
Neglect of basic preservation of data
Neglect of data management
No proper mechanism for quality control: without the data or instruments needed to reproduce results, no check is possible
See also:
https://www.utwente.nl/en/organization/structure/management/good-management/
Netherlands Code of Conduct for Academic Practice: Verification section
6. Research Data Management: importance (2/2)
(2) NWO and EU Horizon 2020 data management pilots
Focus on open data and reuse
Data Management Plan
Data archived in a data repository
NWO: http://www.nwo.nl/en/policies/open+science/data+management
EU H2020: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pi
(3) Developments in science
Data-intensive science (4th paradigm)
Data collections are future assets of research groups
7. What you will learn today
Data management planning: how to make a DMP, which issues to address and how to describe them (interactive)
Awareness of the importance of managing data after research: data citation and publication (persistent identifiers) and proper data archiving
Knowledge about legal issues in data management
8. Programme
9:30  Introduction to Research Data Management (Dr. ir. Maarten van Bentum, data librarian, UT - Library & Archive)
9:45  Data Management Planning (Dr. ir. Maarten van Bentum, data librarian, UT - Library & Archive)
10:00 Small group assignment: writing a DMP section, based on one of the research cases in the group (Dr. ir. Maarten van Bentum, data librarian, UT - Library & Archive)
10:45 Break
11:00 Plenary presentations: each group presents the section they have prepared; the other teams act as the EU review committee (Dr. ir. Maarten van Bentum, data librarian, UT - Library & Archive)
12:30 Lunch
13:30 Data Citation: claiming data with DOIs, incl. small assignments (Ellen Verbakel, data librarian, TU Delft - 3TU.Datacentrum)
14:00 Hands-on: Data CV, ORCID, participants individually (Ellen Verbakel, data librarian, TU Delft - 3TU.Datacentrum)
14:45 Data publications (Ellen Verbakel, data librarian, TU Delft - 3TU.Datacentrum)
15:00 Break
15:15 Data archive, Dataseal, DIY/DIT (Ellen Verbakel, data librarian, TU Delft - 3TU.Datacentrum)
15:30 Legal issues: data retention, data protection, privacy, ownership (Drs. Heiko Tjalsma, legal advisor, DANS)
16:30 Evaluation form: tell us what you think about this course
16:45 Closure
9. Data Management Plan – a definition
A formal research project document describing what data will be collected, how it will be stored, described and archived, and how access, reuse and linking to publications will be realised.
10. Data Management Plan - topics
Responsibility
Description of the data
Methodology of data collection
Documentation: metadata (standards)
Quality assurance
Storage and backup
Policies for access and sharing, and provisions for appropriate protection/privacy
Policies and provisions for reuse and redistribution
Plans for archiving and preservation of access
From: National Science Foundation and University of California
11. Data Management Plan - templates
Information, templates and checklists
UT template: RDM website of the Library & Archive
3TU.Datacentrum: template
DANS checklist
NWO form
12. Writing a DMP
6 small groups (data collection; data storage and backup; data documentation; data access; data sharing and reuse; data preservation and archiving)
Use the UT template
Work with a research case or dataset of one of the group members
Plenary presentations and discussion (15 min each)
13. DMP - Data collection (1/1)
Type of data > what else should be considered an object for management: software, models, scripts, instruments, questionnaires, informed consent, etc.
Legal and contractual regulations: personal data? > Dutch Personal Data Protection Act, http://www.utwente.nl/az/gegevensbescherming/ (in Dutch)
UT classification guideline for information and information systems (in Dutch)
Who collects the data: a third party? > contract about rights and licenses; example: bankruptcy research agency (see later: data access)
14. DMP - Data storage and backup (1/4)
Criteria
Sustainability/reliability: backup frequency (offline/offsite?)
Dataset type: raw dataset, versions during processing and analysis, final datasets
Dataset size: capacity, costs, data transfer
Legal or contractual regulations
Access: individual, community, open
15. DMP - Data storage and backup (2/4)
Storage options
1. UT central storage
   p- or m-disk (ICTS): http://www.utwente.nl/icts/diensten/catalogus/dataopslag_mw/storage/
2. Project, community or research institute storage
   IGS Datalab: https://www.utwente.nl/igs/datalab/
3. Individual data storage (computer, dvd/cd, external hard disk, …)
4. Non-commercial cloud storage
   Surfdrive: https://www.surfdrive.nl/en
   DataverseNL: https://dataverse.nl/dvn/
5. Commercial cloud storage: Dropbox, OneDrive, …
16. DMP - Data storage and backup (3/4)
Storage options compared:

University of Twente (ICTS) central storage (M: and P:)
  Advantages: full service; reliable, durable, secure; high-speed data transfer
  Disadvantages: no sharing outside UT
  Suitable for: saving large data files; master copy of data; use encryption for sensitive and critical data; use SURFfilesender for encrypted data transfer

PC or laptop
  Advantages: always available; portable; low cost; high-speed data transfer
  Disadvantages: sensitive to damage and loss (no automatic backup); no sharing
  Suitable for: saving large data files; temporary storage; use encryption for sensitive and critical data

Personal storage devices (USB flash, external hard drive, DVD/CD)
  Advantages: portable; low cost
  Disadvantages: easily damaged or lost (no automatic backup); not for sensitive or critical data; difficult sharing
  Suitable for: saving large data files; temporary storage of standard data

Non-commercial cloud services (for example, DataverseNL, SURFdrive)
  Advantages: automatic synchronization on several devices; easy access; external sharing
  Disadvantages: medium-speed data transfer; not for sensitive or critical data (SURFdrive: when encrypted)
  Suitable for: sharing standard data with external parties

Commercial cloud services (for example, Dropbox, Google Drive, OneDrive)
  Advantages: automatic synchronization on several devices; easy access; external sharing
  Disadvantages: medium-speed data transfer; not for sensitive or critical data; unclear access to data; unclear privacy regulations
  Suitable for: sharing standard data with external parties
17. DMP - Data storage and backup (4/4)
UT data policy
During the research, the research data will be saved in a central repository which is available to at least the members of the research group/institute and which is managed by this research group/institute.
Storage and access should be managed in accordance with legal regulations, any third-party contractual requirements, etc.
Backup
3 copies (original, external/local, external/remote)
Local vs. remote depends on the recovery time needed
Data transfer
https://www.utwente.nl/icts/en/diensten/catalogus/filesender/
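The three-copy backup rule above (original, external/local, external/remote) can be checked mechanically. The sketch below is illustrative only; the directory names are hypothetical stand-ins for actual storage locations, not UT infrastructure:

```python
from pathlib import Path

# Hypothetical stand-ins for the three copy locations:
# the original, an external/local backup, and an external/remote backup.
COPY_LOCATIONS = [
    Path("data_original"),
    Path("backup_local"),
    Path("backup_remote"),
]

def missing_copies(filename: str) -> list:
    """Return the locations in which a copy of `filename` is missing."""
    return [str(loc) for loc in COPY_LOCATIONS if not (loc / filename).exists()]

# Example: place a copy of a small data file in all three locations,
# then verify that none are missing.
for loc in COPY_LOCATIONS:
    loc.mkdir(exist_ok=True)
    (loc / "survey_2015.csv").write_text("id,answer\n1,yes\n")

print(missing_copies("survey_2015.csv"))  # [] when all three copies exist
```

Running such a check periodically turns the backup policy into something verifiable rather than a one-off intention.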
18. DMP - Data documentation (1/4)
Documentation during research, for dynamic datasets (for yourself and for fellow researchers in the project and/or group)
Documentation after research, for static datasets (for discovery, verification, replication, and reuse)
Documentation: standard metadata schemes enhanced with the specific descriptive elements necessary for verification, replication, and reuse
See list: http://www.dcc.ac.uk/resources/metadata-standards/list
See also: 3TU.Datacentrum data description and formats
19. DMP - Data documentation (2/4)
Title: name of the dataset or research project that produced it
Creator: names and addresses of the organization or people who created the data, including all significant contributors
Identifier: the identification number used to identify the data, even if it is just an internal project reference number
Subject: keywords or phrases describing the subject or content of the data
Dates: key dates associated with the data, including: project start and end date; release date; other dates associated with the data lifespan, e.g., maintenance cycle, update schedule
Funders: organizations or agencies who funded the research
Language: language(s) of the intellectual content of the resource, when relevant
Location: where the data relates to a physical location, record information about its spatial coverage
Rights: description of any known intellectual property rights held for the data
List of file names and relationships: list of all digital files in the archive, with their names and file extensions (e.g., 'NWPalaceTR.WRL', 'stone.mov')
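These elements can also be kept as a small machine-readable record stored next to the data. The sketch below uses invented placeholder values; it is not a real dataset and not a UT-prescribed format:

```python
import json

# Illustrative metadata record following the elements listed above.
# All field values are placeholders.
metadata = {
    "Title": "Example survey dataset",
    "Creator": "J. Doe, Example University",
    "Identifier": "PROJ-2015-001",  # internal project reference number
    "Subject": ["research data management", "survey"],
    "Dates": {"start": "2015-01-01", "end": "2015-12-31"},
    "Funders": ["Example funding agency"],
    "Language": "en",
    "Rights": "CC BY 4.0",
    "Files": ["survey_2015.csv", "codebook.txt"],
}

# Store the record as a text file in the same directory as the data.
with open("metadata.json", "w", encoding="utf-8") as f:
    json.dump(metadata, f, indent=2)
```

A plain-text record like this stays readable without special software, which matters for long-term preservation.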
20. DMP - Data documentation (3/4)
Formats: format(s) of the data, e.g., FITS, SPSS, HTML, JPEG
Methodology: how the data was generated, including equipment or software
used, experimental protocol, other things you would include in your lab
notebook. Can reference a published article, if it covers everything
Workflows or analyses: to be able to reproduce your work
Sources: references to source material for data derived from other sources,
including details of where the source data is held and how it is identified
and accessed
Versions: date/time stamped, with a separate ID (e.g., version number) for
each version
Checksums: to test whether your file has changed over time
Explanation of codes used in file names: brief explanation of any naming
conventions or abbreviations used to label the files
List of codes used in files: list of any special values used in the data (e.g.,
codes for categorical survey responses, '999 indicates a "dummy" value in
the data,' etc.)
Store metadata in a text file (such as a readme file or codebook) in the
same directory as the data
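The checksum idea above can be sketched in a few lines of Python: compute a SHA-256 digest of a data file and store it next to the data, so a later recomputation reveals any change. The file names here are hypothetical examples.

```python
# Minimal sketch: record a SHA-256 checksum so you can later detect
# whether a data file has changed. File names are hypothetical examples.
import hashlib
from pathlib import Path

def sha256sum(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

data_file = Path("survey_2014_v1.csv")       # hypothetical data file
data_file.write_text("id;response\n1;999\n")

digest = sha256sum(data_file)
# Store the digest next to the data (e.g. in the readme or a checksum
# file); recompute it later and compare to detect silent changes.
Path("CHECKSUMS.txt").write_text(f"{digest}  {data_file.name}\n")
print(digest)
```

Reading the file in chunks keeps memory use constant, which matters for large research data sets.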
21. DMP - Data documentation (4/4)
File naming conventions: http://guides.lib.purdue.edu/content.php?pid=440001&sid=4901667
Good directory structure:
Top-level directory should include
Project title
Unique identifier
Date (e.g. year)
Substructure should have a clear, documented naming convention,
e.g. one directory per run of an experiment, per version of a dataset,
or per person in the group.
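The layout described above can be sketched as follows; the project name, identifier, and subdirectory names are hypothetical placeholders:

```python
# Minimal sketch of the suggested directory layout: a top-level name
# combining project title, unique identifier and year, with a documented
# substructure. All names here are hypothetical examples.
from pathlib import Path

top = Path("soil-moisture-study_UT-2014-017_2014")  # title_identifier_year
for sub in ["raw/run-01", "raw/run-02", "processed/v1", "docs"]:
    (top / sub).mkdir(parents=True, exist_ok=True)

# A short readme documents the naming convention for the substructure.
(top / "docs" / "README.txt").write_text(
    "raw/run-NN   : one directory per experimental run\n"
    "processed/vN : one directory per dataset version\n"
)

print(sorted(str(p) for p in top.rglob("*")))
```

Creating the structure in a script rather than by hand makes the convention explicit and repeatable across projects.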
22. DMP - Data access (1/3)
- UT data policy?
- Funder requirements?
- Requirements of other parties? Contracts?
- Open Access required? Possible? Dutch Personal Data Protection
Act (UT Data Protection Officer)
23. DMP - Data access (2/3)
data access                        M:drive            P:drive              DataverseNL  Surfdrive  Commercial cloud
                                   (home directory)   (group permissions)                          (Dropbox, etc.)
internal group/organization        no                 yes                  yes          yes        yes
external group/organization        no                 no                   yes          yes        yes
on request                         no                 no                   yes          no         no
view/download rights management    no                 yes                  yes          yes        yes
edit rights management             no                 yes                  yes          yes        yes
collaborating on data              no                 no                   yes          yes        yes
24. DMP - Data access (3/3)
DataverseNL
dynamic data sets (file version control)
static data sets (release with persistent id)
access rights management
not for privacy-sensitive data!
25. DMP - Data sharing and reuse (1/1)
Why share your data?
Replication / verification
Promote your research
Enable new discoveries (reuse)
"Open where possible, protected where needed"
See NWO policy http://www.nwo.nl/en/policies/open+science
After research: public, linked to publication(s) > DataverseNL, data
centres
26. DMP - Data preservation and archiving (1/2)
UT data policy
Preferably during the research, but no later than 1 month after
finishing the research, the research data are archived in a trusted
repository (e.g. DANS or 3TU.Datacentrum). The research data
are, taking legal regulations and any third-party contractual
conditions into account, preferably publicly available. This covers at
least the research data that form the basis of publications about the
research, but can also comprise the full set of raw and/or edited
research data.
After the research all durably stored research data and the
publications based on those data are linked. This is at least the
case for PhD dissertations.
27. DMP – Data preservation and archiving (2/2)
Data centres:
3TU.Datacentrum
DANS
List of data repositories: Databib or Data repositories
Editor's Notes
General reasons for more attention to RDM
Specific benefits of good RDM
Costs time in the beginning, saves time in the end and overall
Data loss, data corruption, unauthorized access (confidential data, privacy, …)
Good to show that your research is based on proper data creation and handling and that, partly because of that, it can be replicated. Some remarks: data as reference material.
Although still underestimated: when data are linked to a publication, it raises the value of that publication (more journals require data with the publication). Data in themselves can be seen as output (cf. data journals).
Data management needed for these reasons (integrity) but also for other (scientific) users, obligation of funders (OA), and other reasons.
To avoid any doubts on scientific integrity: in general good practice, but some bad practices.
Criteria: Fabrication, Falsification and Plagiarism (FFP)
Fabrication of data (Stapel, Schön)
Untraceable data (Poldermans)
Neglect of basic preservation of data
Neglect of data management
No proper mechanism for quality control: no data or instruments for easy data reproduction means no possible check
NWO pilot
from 1-1-2015,
7 rounds of funding
Data management section (based on 4 questions) followed by a DMP after funding is awarded
EU H2020
from 1-1-2014
7 research areas
Data Management Plan required within six months after project grant
Deposit in a research data repository
Opting out of the pilot is possible when motivated
DMP regarded as living document
Data intensive research: New type of research (research without any lab/field time and more data than we can analyze)
Learn from data management practice from other researchers in different scientific fields.
Learn different solutions, in many cases not standard
The term data curation is also often mentioned. This is broader than data management: it also covers the technical side of data handling, both during and after the research, for instance how data centres handle data during preservation. It is therefore less suitable for describing the handling of data by the researcher.
Good management starts with a data management plan.
person responsible for data management within your research project
description of the data and the methods used to collect or create the data
how data will be documented throughout the research project
how data quality will be assured
backup procedures
how data will be made available for public use and potential secondary uses
preservation plans
any exceptional arrangements that might be needed to protect participant confidentiality or intellectual property
UT data classification guideline (only in Dutch: informatiebeveiliging, classificatierichtlijn informatie en informatiesystemen)
See also access
(see also guidance DMP)
In general you need 3 copies : original, external/local, external/remote
Dataverse: come back later to that with data access
Surf FileSender: encrypted.
There are subject-based metadata schemes, but even these may be too generic for your data.
- Who decides? Can IP on data be claimed? Does any party claim IP?
UT data policy: no statement about ownership
What if the data collection is done by a specific organisation… (bankrupt? > curator?)
This afternoon more in presentation on legal issues
Hosted by DANS, data archive for social sciences and humanities
Store, describe data sets and give selected access
Keep all versions?
Just final version?
First and last?
DANS and 3TU.Datacentrum: Data seal of approval
Question to estimate costs: no tariff structure yet; 4.5 euro/GB. Invoiced to the university; how this will be passed on to research projects is not clear yet.
More about data archiving, data citation, etc. in afternoon session